Welcome to Reality~Trivia
September 2024
“Civil disobedience is not our problem. Our problem is civil obedience. Our problem is that people all over the world have obeyed the dictates of leaders…and millions have been killed because of this obedience…Our problem is that people are obedient all over the world in the face of poverty and starvation and stupidity, and war, and cruelty. Our problem is that people are obedient while the jails are full of petty thieves… (and) the grand thieves are running the country. That’s our problem.”
― Howard Zinn
-------------------------------------------------------
the apocalypse of settler colonialism
by dr. gerald horne
"through the centuries, the Republic that eventuated in North America has maintained a maximum of chutzpah and a minimum of self-awareness in forging a creation myth that sees slavery and dispossession not as foundational but inimical to the founding of the nation now known as the United States. But, of course, to confront the ugly reality would induce a persistent sleeplessness interrupted by haunted dreams, thus this unsteadiness has prevailed."
Oliver Stone & Peter Kuznick : The Untold History of the United States
...Why do such a tiny number of people - whether the figure is currently 300 or 500 or 2,000 - control more wealth than the world's poorest 3 billion? Why are a tiny minority of wealthy Americans allowed to exert so much control over U.S. domestic politics, foreign policy, and media while the great masses see a diminution of their real power and standards of living? Why have Americans submitted to levels of surveillance, government intrusion, abuse of civil liberties, and loss of privacy that would have appalled the Founding Fathers and earlier generations? Why does the United States have a lower percentage of unionized workers than any other advanced industrial democracy? Why, in our country, are those who are driven by personal greed and narrow self-interest empowered over those who extol social values like kindness, generosity, compassion, sharing, empathy, and community building? And why has it become so hard for the great majority of Americans to imagine a different, we would say a better, future than the one defined by current policy initiatives and social values....
the two party system
ANCIENT ROME STILL DEFINES US POLITICS OF WAR AND POVERTY
BY JACQUELINE MARCUS, TRUTHOUT
To see that our political system has not evolved from the days of Rome, morally, ethically, legally, environmentally, culturally; that we still live in an age of wars; that our political policies still operate under an archaic system that creates dire poverty and desolation for the masses of people -- a system that exploits resources with no regard for the pollution it creates in order to benefit the few; that this political system has prevailed over the course of 2,000 years or more is quite stunning, indeed, embarrassing, when you think about it.
articles
US health system ranks last compared with peer nations, report finds
(ARTICLE BELOW)
Poor People Are Business Owners, Too – But Myths Around Poverty and Entrepreneurship Hold Them Back
(ARTICLE BELOW)
Higher prices, lower turnover, more workers: The reality of California's $20 fast-food minimum wage
(ARTICLE BELOW)
Debunking the myth that 'inflation is caused by wage increases and too much government spending'
(ARTICLE BELOW)
ROBERT REICH DEBUNKS THE MYTH THAT 'THE RICH DESERVE TO BE RICH'
(ARTICLE BELOW)
SHATTERING DECEPTIVE MIRRORS: YOUNGER GENERATIONS HAVE THE CHANCE TO BUCK THE BEAUTY INDUSTRY SCAM
(ARTICLE BELOW)
BOTTLED WATER CONTAINS HUNDREDS OF THOUSANDS OF PLASTIC BITS: STUDY
(ARTICLE BELOW)
MEDICARE ADVANTAGE PLANS: THE HIDDEN DANGERS AND THREATS TO PATIENT CARE
(ARTICLE BELOW)
A NEW STUDY DESCRIBES IN GROTESQUE DETAIL THE EXTENT TO WHICH THE ULTRARICH HAVE PERVERTED THE CHARITABLE GIVING INDUSTRY.
(ARTICLE BELOW)
HOW TRUMP AND BUSH TAX CUTS FOR BILLIONAIRES BROKE AMERICA
(ARTICLE BELOW)
FROM 1947 TO 2023: RETRACING THE COMPLEX, TRAGIC ISRAELI-PALESTINIAN CONFLICT
(ARTICLE BELOW)
RED STATE CONSERVATIVES ARE DYING THANKS TO THE PEOPLE THEY VOTE FOR
(ARTICLE BELOW)
HOW TEXAS BECAME THE NEW "HOMEBASE" FOR WHITE NATIONALIST AND NEO-NAZI GROUPS
(ARTICLE BELOW)
HOW THE GOP SUCKERED AMERICA ON TAX CUTS
(ARTICLE BELOW)
'MISLEADING': ALARM RAISED ABOUT MEDICARE ADVANTAGE 'SCAM'
(ARTICLE BELOW)
EXCERPT: WHY ARE WE LETTING THE RED STATE WELFARE OLIGARCHS MOOCH OFF BLUE STATES?
(ARTICLE BELOW)
AMERICA'S "SYSTEMIC RACISM" ISN'T JUST DOMESTIC: CONSIDER WHO DIES AROUND THE WORLD IN OUR WARS
(ARTICLE BELOW)
THE DEBT LIMIT IS JUST ONE OF AMERICA’S SIX WORST TRADITIONS
(ARTICLE BELOW)
*REFINED CARBS AND RED MEAT DRIVING GLOBAL RISE IN TYPE 2 DIABETES, STUDY SAYS
(ARTICLE BELOW)
*A NEW STUDY LINKS 45 HEALTH PROBLEMS TO "FREE SUGAR." HERE'S WHAT THAT MEANS, AND HOW TO AVOID IT
(ARTICLE BELOW)
*REPUBLICAN POLICIES ARE KILLING AMERICANS: STUDY
(ARTICLE BELOW)
*THE CORPORATE NARRATIVE ON INFLATION IS BOGUS
(ARTICLE BELOW)
*US FOR-PROFIT HEALTH SYSTEM IS A MASS KILLER
(ARTICLE BELOW)
*REPUBLICAN COUNTIES HAVE HIGHER MORTALITY RATES THAN DEMOCRATIC ONES, STUDY FINDS
(ARTICLE BELOW)
*BOTTLED WATER GIANT BLUETRITON ADMITS CLAIMS OF RECYCLING AND SUSTAINABILITY ARE “PUFFERY”
(ARTICLE BELOW)
*“IF YOU’RE GETTING A W-2, YOU’RE A SUCKER”
(ARTICLE BELOW)
*POOREST US COUNTIES SUFFERED TWICE THE COVID DEATHS OF THE RICHEST
(ARTICLE BELOW)
*LIFE EXPECTANCY LOWEST IN RED STATES -- AND THE PROBLEM IS GETTING WORSE
(ARTICLE BELOW)
*EXCERPT: THE RED STATE MURDER PROBLEM
(ARTICLE BELOW)
*'CONFLICTED CONGRESS': KEY FINDINGS FROM INSIDER'S FIVE-MONTH INVESTIGATION INTO FEDERAL LAWMAKERS' PERSONAL FINANCES
(ARTICLE BELOW)
*WE COULD VACCINATE THE WORLD 3 TIMES OVER IF THE RICH PAID THE TAXES THEY OWE
(ARTICLE BELOW)
*THE SEEDY CRIMES OF THE OBSCENELY RICH ARE ROUTINELY IGNORED
(ARTICLE BELOW)
*THE 'JOB CREATORS' FANTASY IS A MALIGNANT MYTH THAT THE RICH USE TO SQUEEZE THE WORKING CLASS
(ARTICLE BELOW)
*THE MURDER OF THE U.S. MIDDLE CLASS BEGAN 40 YEARS AGO THIS WEEK
(ARTICLE BELOW)
*BIPARTISAN INFRASTRUCTURE BILL INCLUDES $25 BILLION IN POTENTIAL NEW SUBSIDIES FOR FOSSIL FUELS
(ARTICLE BELOW)
*THERE'S A STARK RED-BLUE DIVIDE WHEN IT COMES TO STATES' VACCINATION RATES
(ARTICLE BELOW)
*EVEN BY PENTAGON TERMS, THIS WAS A DUD: THE DISASTROUS SAGA OF THE F-35
(ARTICLE BELOW)
*DON'T WORRY: IF YOU'RE CONCERNED ABOUT RISING FEDERAL DEBT -- READ THIS
(ARTICLE BELOW)
*CORPORATE CONCENTRATION IN THE US FOOD SYSTEM MAKES FOOD MORE EXPENSIVE AND LESS ACCESSIBLE FOR MANY AMERICANS
(ARTICLE BELOW)
*EVEN BIDEN’S $1.9 TRILLION ISN’T NEARLY ENOUGH PANDEMIC RELIEF
(ARTICLE BELOW)
*THE ECONOMY DOES MUCH BETTER UNDER DEMOCRATS. WHY?
G.D.P., JOBS AND OTHER INDICATORS HAVE ALL RISEN MORE SLOWLY UNDER REPUBLICANS FOR NEARLY THE PAST CENTURY.
(ARTICLE BELOW)
*STUDY OF 50 YEARS OF TAX CUTS FOR RICH CONFIRMS ‘TRICKLE DOWN’ THEORY IS AN ABSOLUTE SHAM
(ARTICLE BELOW)
*TO REVERSE INEQUALITY, WE NEED TO EXPOSE THE MYTH OF THE ‘FREE MARKET’
(ARTICLE BELOW)
*RAPID TESTING IS LESS ACCURATE THAN THE GOVERNMENT WANTS TO ADMIT
(ARTICLE BELOW)
*Trump’s Wildly Exaggerated Help For Black Voters
(ARTICLE BELOW)
*Powerless Farmworkers Suffer Under Trump’s Anti-Migrant Policies
(ARTICLE BELOW)
*Farmers Are Plagued by Debt and Climate Crisis. Trump Has Made Things Worse.
(ARTICLE BELOW)
*THE SUPER-RICH—YOU KNOW, PEOPLE LIKE THE TRUMPS —ARE RAKING IN BILLIONS
(ARTICLE BELOW)
*Retirements, layoffs, labor force flight may leave scars on U.S. economy
(ARTICLE BELOW)
*THE UGLY NUMBERS ARE FINALLY IN ON THE 2017 TRUMP TAX REWRITE
(ARTICLE BELOW)
*THE FED INVESTED PUBLIC MONEY IN FOSSIL FUEL FIRMS DRIVING ENVIRONMENTAL RACISM
(ARTICLE BELOW)
*'WHITE SUPREMACY' WAS BEHIND CHILD SEPARATIONS — AND TRUMP OFFICIALS WENT ALONG, CRITICS SAY
(ARTICLE BELOW)
*N.Y. NURSES SAY USED N95 MASKS ARE BEING RE-PACKED IN BOXES TO LOOK NEW
(ARTICLE BELOW)
*HOW MUCH DOES UNION MEMBERSHIP BENEFIT AMERICA'S WORKERS?
(ARTICLE BELOW)
*American billionaires’ ties to Moscow go back decades
(ARTICLE BELOW)
*‘Eye-popping’ analysis shows top one percent gained $21 trillion in wealth since 1989 while bottom half lost $900 billion
(ARTICLE BELOW)
*Here's what the US imports from Mexico
(ARTICLE BELOW)
*AMERICA’S BIGGEST LIE
(ARTICLE BELOW)
*CAPITALISM AND DEMOCRACY: THE STRAIN IS SHOWING
(ARTICLE BELOW)
*ALMOST TWO-THIRDS OF PEOPLE IN THE LABOR FORCE DO NOT HAVE A COLLEGE DEGREE, AND THEIR JOB PROSPECTS ARE DIMMING
(ARTICLE BELOW)
*TOP 10 WAYS THE US IS THE MOST CORRUPT COUNTRY IN THE WORLD
(ARTICLE BELOW)
*reality funnies and charts (below)
US health system ranks last compared with peer nations, report finds
Despite Americans paying nearly double that of other nations, the US fares poorly in list of 10 countries
Jessica Glenza - the guardian
Thu 19 Sep 2024 00.01 EDT
The United States health system ranked dead last in an international comparison of 10 peer nations, according to a new report by the Commonwealth Fund.
In spite of Americans paying nearly double that of other countries, the system performed poorly on health equity, access to care and outcomes.
“I see the human toll of these shortcomings on a daily basis,” said Dr Joseph Betancourt, the president of the Commonwealth Fund, a foundation with a focus on healthcare research and policy.
“I see patients who cannot afford their medications … I see older patients arrive sicker than they should because they spent the majority of their lives uninsured,” said Betancourt. “It’s time we finally build a health system that delivers quality affordable healthcare for all Americans.”
However, even as high healthcare prices bite into workers’ paychecks, the economy and inflation dominate voters’ concerns. Neither Kamala Harris nor Donald Trump has proposed major healthcare reforms.
The Democratic presidential nominee has largely reframed healthcare as an economic issue, promising medical debt relief while highlighting the Biden administration’s successes, such as Medicare drug price negotiations.
The Republican presidential nominee said he has “concepts of a plan” to improve healthcare, but has made no proposals. The conservative policy agenda Project 2025 has largely proposed gutting scientific and public health infrastructure.
However, when asked about healthcare issues, voters overwhelmingly ranked cost at the top. The cost of drugs, doctors and insurance are the top issue for Democrats (42%) and Republicans (45%), according to Kaiser Family Foundation health system polling. Americans spend $4.5tn per year on healthcare, or more than $13,000 per person per year on healthcare, according to federal government data.
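As a rough sanity check on those spending figures, the totals are consistent with each other (assuming a US population of about 335 million, a figure not stated in the article):

```python
# Quick arithmetic check: does $4.5 trillion in total annual healthcare
# spending work out to "more than $13,000 per person"?
total_spend = 4.5e12          # $4.5tn/year, from the article
population = 335_000_000      # assumed US population, roughly mid-2023

per_person = total_spend / population
print(f"${per_person:,.0f} per person per year")  # roughly $13,400
```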
The Commonwealth Fund’s report is the 20th in their “Mirror, Mirror” series, an international comparison of the US health system to nine wealthy democracies including Australia, Canada, France, Germany, the Netherlands, New Zealand, the UK, Sweden and Switzerland. The foundation calls this year’s report a “portrait of a failing US health system”.
The report uses 70 indicators from across five main sectors, including access to care, health equity, care process, administrative efficiency and outcomes. The measures are derived from a survey conducted by Commonwealth as well as publicly available measures from the World Health Organization, OECD and Our World in Data.
In all but “care process” – the domain that covers issues such as reconciling medications – the US ranked as the last or penultimate nation. Presenters for Commonwealth noted the US is often “in a class of its own” far below the nearest peer nation.
“Poverty, homelessness, hunger, discrimination, substance abuse – other countries don’t make their health systems work so hard,” said Reginald D Williams II, vice-president of the fund. He said most peer nations cover more of their citizens’ basic needs. “Too many individuals in the US face a lifetime of inequity, it doesn’t have to be this way.”
But recommendations to improve the US health system’s standing among peer nations will not be easy to implement.
The fund said the US would need to expand insurance coverage and make “meaningful” improvements on the amount of healthcare expenses patients pay themselves; minimize the complexity and variation in insurance plans to improve administrative efficiency; build a viable primary care and public health system; and invest in social wellbeing, rather than thrust problems of social inequity onto the health system.
“I don’t expect we will in one fell swoop rewrite the social contract,” said Dr David Blumenthal, the fund’s past president and an author of the report. “The American electorate makes choices about which direction to move in, and that is very much an issue in this election.”
RELATED: Cigna SLAPPs the FTC
Cigna, parent of PBM Express Scripts, has filed a lawsuit against the Federal Trade Commission, seeking to force the agency to retract a recent interim report depicting PBMs as “powerful middlemen inflating drug costs and squeezing main street pharmacies.”
Poor People Are Business Owners, Too – But Myths Around Poverty and Entrepreneurship Hold Them Back
Michael H. Morris, University of Notre Dame - dc report
September 4, 2024
Nearly 1 in 5 people in the world lives in poverty. Even in many developed countries such as the U.S., poverty rates exceed 12%. In an age of breathtaking technological progress and dynamic social change, poverty remains stubbornly persistent. As a professor of entrepreneurship, I’m interested in a critical question: Can people in poverty create their own path to prosperity? In other words, is venture creation a viable poverty alleviation tool?
My work has shown that it can be – with the right kind of support. However, that support is often lacking.
A big part of the problem is ignorance: Most people simply don’t know much about poverty and entrepreneurship. There are plenty of myths when it comes to the ventures of the poor, due in part to the lack of hard data about the businesses of those in poverty.
These misconceptions have influenced public policy officials, economic development professionals and academics. As a result, they tend to undervalue the important economic and social role that these businesses play.
In an attempt to correct the record, here are six facts that people should know about poverty and entrepreneurship.
Fact 1: Poor People Start Businesses – Lots Of Them
It’s a myth that entrepreneurship is just for the rich. In fact, most ventures across the globe are started by people in disadvantaged circumstances. While hard data is difficult to come by, the evidence we do have is suggestive. For example, in some high-poverty sub-Saharan African countries, as many as two out of three adults operate or are in the process of starting their own business.
Such small businesses are arguably the backbone of many developing economies, where over 50% of the population can be in poverty. Even within developed economies, such ventures can be responsible for a meaningful component of gross domestic product.
Fact 2: Businesses Run By Poor People Create Value
Although people in poverty disproportionately create “survival businesses” that generate small profits, it’s wrong to assume that makes these ventures less valuable. Such businesses provide jobs to millions of impoverished people, representing an economic lifeline. They create value in the marketplace, filling niches that aren’t attractive to incumbent firms.
And they create more than just economic value: These businesses are embedded in the fabric of communities, providing a source of social stability. They pay taxes and can produce spillover benefits such as reduced crime, increased school completion rates and community pride.
Fact 3: Entrepreneurship Can Help Alleviate Poverty
A growing body of research suggests that higher levels of entrepreneurship are associated with greater reductions in poverty. For example, one analysis found that areas with the highest rates of entrepreneurship among the poor demonstrated the largest reductions in poverty over a six-year period.
This shouldn’t come as a major surprise. After all, while people in poverty often create survival businesses that generate small profits, venture creation represents a critical vehicle for human capital development. People who start businesses learn how to organize production, manage cash, serve customers, set prices and coordinate logistics.
What’s more, the entrepreneurial experience can enable self-sufficiency, identity development, a sense of pride and purpose, and the ability to give back.
Fact 4: Off-the-books Businesses Have Value For Society
Poor entrepreneurs often start what economists call “informal” businesses – enterprises that aren’t registered with the government and that operate under the radar. These often attract criticism.
But while off-the-books businesses may not be legal, the informal sector represents 50% or more of the economy in many developing countries, and as much as 20% in some developed nations. It represents a vast incubator that sustains the poor as they experiment with businesses and learn. In my opinion, this hidden enterprise culture should be nurtured.
Fact 5: The Biggest Challenge Isn’t Always Lack Of Money
People often assume that the key to helping ventures of the poor is to provide more capital. But despite a clear need for funding, some entrepreneurs may not be ready to make effective use of additional money. Regardless of how motivated or hard-working they are, the core issue for entrepreneurs is the ability to convert means into ends.
When an entrepreneur lacks key capabilities, such as bookkeeping, selling or inventory management, research suggests that to be effective, funding should be coupled with other forms of support. An investment is likely to be more productive when it is tied to participation in training and mentoring programs. Access to incubators, attendance at networking events and related developmental activities also are important.
Fact 6: There’s More Than One Way To Succeed
People in the world of entrepreneurship love a big success story. It’s all about picking winners. That kind of thinking works against poor entrepreneurs, who generally start basic businesses that don’t employ novel technologies, and who often have severely limited resources.
To realize the potential of entrepreneurship, it’s worth rethinking the definition of success. For the poor, success could be getting the business established and making sales, earning a profit. It could be changing the entrepreneur’s economic circumstances, hiring employees – particularly others in poverty – or adding another location.
It could be keeping the business going some number of years, providing a kind of legacy. Other success indicators can include reducing the dependency on one’s own labor, satisfying customers and the ability to give back to the community.
In the end, success is about having a better life. And research is demonstrating how entrepreneurship can make this possible.
Venture creation is not a silver bullet. Poverty is complex, and building a sustainable business is difficult. Realizing the promise of entrepreneurship requires that we get past these myths and develop the kinds of supportive environments that level the playing field.
Higher prices, lower turnover, more workers: The reality of California's $20 fast-food minimum wage
Since increasing the state's fast-food minimum wage from $15.50 to $20, California has added more fast-food jobs
By Ashlie D. Stevens - salon
Food Editor
Published August 27, 2024 12:55PM (EDT)
When California governor Gavin Newsom signed AB 1228 — legislation which raised the state hourly minimum wage for fast-food employees from $15.50 to $20 — into law last September, members of the fast-food industry were left with a lot of questions before the bill officially went into effect on April 1.
To address some of these, the State of California’s Department of Industrial Relations set up an FAQ board. Their material ranged from broad topics, like what constitutes a “fast-food” restaurant under the new law (the state defines it as a “limited-service restaurant” that sells food or drink for immediate consumption, and has more than 60 locations nationwide), to the more obscure, like whether employers could simply increase the amount of meal or lodging credits administered to employees to count against the minimum wage (no).
However, it didn’t address one of fast-food employers’ biggest questions: How would they actually pay for their workforce under the new law?
While the decision was lauded by many labor activists as part of broader efforts to improve working conditions and address wage disparities, some California franchise owners began preemptively cutting employees’ hours in advance of the minimum wage hike.
For instance, two days after the bill went into effect, Business Insider reported that two Pizza Hut franchisees in the state said they “planned to scrap in-house delivery” and instead rely on third-party services, resulting in around 1,200 workers being laid off. A few days before that, in March, Alex Johnson, who owned 10 Auntie Anne’s Pretzels and Cinnabon restaurants in the San Francisco Bay area, laid off his office staff and told the Associated Press he would now rely on his parents to help with payroll and human services.
“I try to do right by my employees,” Johnson told the publication at the time. “I pay them as much as I can, but this law is really hitting our operations hard.”
Many economists and franchisees predicted AB 1228 would cause the fast-food industry in California to crash outright. However, according to new state and federal employment data, California’s fast-food industry has actually added jobs every month this year — including 11,000 new jobs since the wage increase officially went into effect in April. In a release, Newsom’s office reported that since raising worker wages, every month this year has seen consistent fast-food job gains, and nearly each month has seen more jobs than the same month last year.
“What’s good for workers is good for business, and as California’s fast food industry continues booming every single month our workers are finally getting the pay they deserve,” Newsom said in a written statement. “Despite those who peddled lies about how this would doom the industry, California’s economy and workers are again proving them wrong.”
This isn’t exactly unexpected. Michael Reich, professor of economics at the University of California-Berkeley, said history didn’t support a lot of doomsday predictions about the future of fast-food in California.
"The knock is that a minimum wage increase would lead to businesses closing, workers getting laid off, and much higher prices," Reich told Public News Service. "That's been the knock on every minimum wage increase since 1938. Indeed, a large number of studies have found that minimum wages do not reduce employment in fast food."
Yet despite the positive job growth, the implications of AB 1228 have also proven to be complicated. As expected, higher wages have contributed to an increase in menu prices across fast-food chains, with many companies passing on the cost to consumers.
This summer, the nonprofit Employment Policies Institute surveyed the owners or managers of 182 limited-service restaurants in California and 98% reported they had already raised menu prices, with 93% anticipating they will have to raise menu costs again next year to accommodate rising costs. Relatedly 92% of owners think that “raising menu prices will adversely affect customer foot traffic.”
It’s important to note that this phenomenon isn’t entirely unique to California—fast-food giants nationwide have been adjusting their pricing strategies to cope with rising labor costs, sparking what some have dubbed the “value meal wars.” Companies are fiercely competing to offer the most affordable meal deals, balancing the need to maintain profitability while attracting cost-conscious customers.
On the flip side, the higher wages have had an unexpected benefit: lower employee turnover. With a $20 per hour wage, fast-food jobs in California have become more attractive to higher-quality applicants, reportedly leading to a more stable and skilled workforce.
Joseph Bryant, the executive vice president of the Service Employees International Union — which was a major proponent of AB 1228 — recently told NBC Bay Area that the industry has not only added jobs, but “multiple franchisees have also noted that the higher wage is already attracting better job candidates, thus reducing turnover."
This stability could allow some franchises to streamline operations and improve service quality, potentially offsetting some of the increased costs. However, the pressure on franchise owners remains intense, as they navigate the fine line between staying competitive and managing new operational expenses.
To address some of these questions, the State of California’s Department of Industrial Relations set up an FAQ board. Its material ranged from broad topics, like what constitutes a “fast-food” restaurant under the new law (the state defines it as a “limited-service restaurant” that sells food or drink for immediate consumption and has more than 60 locations nationwide), to the more obscure, like whether employers could simply increase the meal or lodging credits administered to employees to count against the minimum wage (no).
However, it didn’t address one of fast-food employers’ biggest questions: How would they actually pay for their workforce under the new law?
While the decision was lauded by many labor activists as part of broader efforts to improve working conditions and address wage disparities, some California franchise owners began preemptively cutting employees’ hours in advance of the minimum wage hike.
For instance, two days after the bill went into effect, Business Insider reported that two Pizza Hut franchisees in the state said they “planned to scrap in-house delivery” and instead rely on third-party services, resulting in around 1,200 workers being laid off. A few days before that, in March, Alex Johnson, who owned 10 Auntie Anne’s Pretzels and Cinnabon restaurants in the San Francisco Bay Area, laid off his office staff and told the Associated Press he would now rely on his parents to help with payroll and human resources.
“I try to do right by my employees,” Johnson told the publication at the time. “I pay them as much as I can, but this law is really hitting our operations hard.”
Many economists and franchisees predicted AB 1228 would crash California’s fast-food industry outright. According to new state and federal employment data, however, the state’s fast-food industry has actually added jobs every month this year — including 11,000 new jobs since the wage increase officially went into effect in April. In a release, Newsom’s office reported consistent fast-food job gains every month this year, with nearly every month adding more jobs than the same month last year.
“What’s good for workers is good for business, and as California’s fast food industry continues booming every single month our workers are finally getting the pay they deserve,” Newsom said in a written statement. “Despite those who peddled lies about how this would doom the industry, California’s economy and workers are again proving them wrong.”
This isn’t exactly unexpected. Michael Reich, a professor of economics at the University of California, Berkeley, said history didn’t support the doomsday predictions about the future of fast food in California.
"The knock is that a minimum wage increase would lead to businesses closing, workers getting laid off, and much higher prices," Reich told Public News Service. "That's been the knock on every minimum wage increase since 1938. Indeed, a large number of studies have found that minimum wages do not reduce employment in fast food."
Yet despite the positive job growth, the implications of AB 1228 have also proven to be complicated. As expected, higher wages have contributed to an increase in menu prices across fast-food chains, with many companies passing on the cost to consumers.
This summer, the nonprofit Employment Policies Institute surveyed the owners or managers of 182 limited-service restaurants in California; 98% reported they had already raised menu prices, and 93% anticipated raising prices again next year to accommodate rising costs. Relatedly, 92% of owners think that “raising menu prices will adversely affect customer foot traffic.”
It’s important to note that this phenomenon isn’t entirely unique to California—fast-food giants nationwide have been adjusting their pricing strategies to cope with rising labor costs, sparking what some have dubbed the “value meal wars.” Companies are fiercely competing to offer the most affordable meal deals, balancing the need to maintain profitability while attracting cost-conscious customers.
On the flip side, the higher wages have had an unexpected benefit: lower employee turnover. With a $20 per hour wage, fast-food jobs in California have become more attractive to higher-quality applicants, reportedly leading to a more stable and skilled workforce.
Joseph Bryant, the executive vice president of the Service Employees International Union — which was a major proponent of AB 1228 — recently told NBC Bay Area that the industry has not only added jobs, but “multiple franchisees have also noted that the higher wage is already attracting better job candidates, thus reducing turnover."
This stability could allow some franchises to streamline operations and improve service quality, potentially offsetting some of the increased costs. However, the pressure on franchise owners remains intense, as they navigate the fine line between staying competitive and managing new operational expenses.
Debunking the myth that 'inflation is caused by wage increases and too much government spending'
Robert Reich - alternet
July 27, 2024
You’ve probably been told that the main causes of rising prices are wage gains and excessive government spending. Wrong.
Prices have risen and remained high — especially in critical sectors such as energy, drugs, and food — largely because giant corporations have been raising their prices to increase their profits.
They can do this because they face so little competition.
Worried about sky-high airfares and lousy service? That’s largely because airlines have merged from 12 carriers in 1980 to only four today.
Concerned about drug prices? Between 1995 and 2015, 60 leading pharmaceutical companies merged to only 10.
Upset about food costs? Four large companies now control 85 percent of beef processing, 70 percent of the pork market, and 54 percent of poultry.
Worried about grocery prices? Just three giants — Albertsons, Kroger, and Walmart — control 70 percent of the grocery sales in 167 cities.
And on and on through almost every sector of the economy, including rental housing, ad-tech, chemicals, and health care.
Monopolies can raise prices and keep them high because they don’t face enough competitors charging lower prices and grabbing consumers away.
Right now, the Federal Reserve Board has responsibility for fighting inflation. When prices rise, the Fed raises interest rates to slow the overall economy.
But slowing the economy with high interest rates causes many people to lose jobs. It keeps wages low. And it raises credit card fees as well as the costs of home loans, car loans, and every other borrowing cost. These burdens fall especially hard on people with lower incomes.
A better way to avoid inflation and lower prices would be to fight pricing power at its source: Break up monopolies with antitrust laws, so that a handful of giant companies can’t artificially raise their prices.
Instead of relying solely on the Federal Reserve Board to tame prices, we should rely on monopoly-busters at the Federal Trade Commission and the Antitrust Division of the Justice Department.
Joe Biden’s appointees at the FTC and the Antitrust Division — Lina Khan and Jonathan Kanter, respectively — have been aggressive monopoly-busters, but much more needs to be done.
Will a President Kamala Harris keep the heat on?
Matt Stoller, who covers monopoly and competition policy on Substack, believes she is likely to. He writes that Biden’s anti-monopolists have made so much progress to date — bringing cases against Amazon, Google, and Ticketmaster; halting the merger of Kroger and Albertsons; and issuing new rules on airlines, shipping, junk fees, credit cards, hearing aids, pharmaceuticals, and data — that the momentum would be hard to slow even if she wanted to. Moreover, the public has come to expect action against monopolies.
Maybe it’s just the optimism of the moment, but I think Stoller is correct and that Harris as president would be as much an economic populist as Biden, if not more.
Trump ran up national debt twice as much as Biden: new analysis
Neil Irwin - axios
Former President Trump ran up the national debt by about twice as much as President Biden, according to a new analysis of their fiscal track records.
Why it matters: The winner of November's election faces a gloomy fiscal outlook, with rapidly rising debt levels at a time when interest rates are already high and demographic pressure on retirement programs is rising.
- Both candidates bear a share of the responsibility, as each added trillions to that tally while in office.
- But Trump's contribution was significantly higher, according to the fiscal watchdogs at the Committee for a Responsible Federal Budget, thanks to both tax cuts and spending deals struck in his four years in the White House.
By the numbers: Trump added $8.4 trillion in borrowing over a ten-year window, CRFB finds in a report out this morning.
- Biden's figure clocks in at $4.3 trillion with seven months remaining in his term.
- If you exclude COVID relief spending from the tally, the numbers are $4.8 trillion for Trump and $2.2 trillion for Biden.
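The headline claim that Trump added “about twice as much” can be checked directly from CRFB’s figures above; a quick back-of-the-envelope calculation (treating the rounded trillions as exact) bears it out:

```python
# Figures quoted above, in trillions of dollars (as rounded by CRFB).
trump_total, biden_total = 8.4, 4.3
trump_ex_covid, biden_ex_covid = 4.8, 2.2

# Trump's ten-year borrowing relative to Biden's.
ratio_total = trump_total / biden_total           # ~1.95, i.e. "about twice"
ratio_ex_covid = trump_ex_covid / biden_ex_covid  # ~2.18 excluding COVID relief

print(f"All borrowing: {ratio_total:.2f}x; excluding COVID relief: {ratio_ex_covid:.2f}x")
```

Either way the comparison is sliced, the ratio lands close to two to one.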
State of play: For Trump, the biggest non-COVID drivers of higher public debt were his signature tax cuts enacted in 2017 (causing $1.9 trillion in additional borrowing) and bipartisan spending packages (which added $2.1 trillion).
- For Biden, major non-COVID factors include 2022 and 2023 spending bills ($1.4 trillion), student debt relief ($620 billion), and legislation to support health care for veterans ($520 billion).
- Biden deficits have also swelled, according to CRFB's analysis, due to executive actions that changed the way food stamp benefits are calculated, expanding Medicaid benefits, and other changes that total $548 billion.
Between the lines: Deficit politics may return to the forefront of U.S. policy debates next year.
- Much of Trump's tax law is set to expire at the end of 2025, and the CBO has estimated that fully extending it would increase deficits by $4.6 trillion over the next decade.
- High interest rates make the taxpayer burden of both existing and new debt higher than it was during the era of near-zero interest rates.
- And the Social Security trust fund is rapidly hurtling toward depletion in 2033, which would trigger huge cuts in retirement benefits absent congressional action.
What they're saying: "The next president will face huge fiscal challenges," CRFB president Maya MacGuineas tells Axios.
- "Yet both candidates have track records of approving trillions in new borrowing even setting aside the justified borrowing for COVID, and neither has proposed a comprehensive and credible plan to get the debt under control," she said.
- "No president is fully responsible for the fiscal challenges that come along, but they need to use the bully pulpit to set the stage for making some hard choices," MacGuineas said.
Robert Reich debunks the myth that 'the rich deserve to be rich'
Robert Reich - alternet
June 14, 2024
Don’t be fooled by the myth that people are paid what they’re “worth” — that the rich deserve their ever-increasing incomes and wealth because they’re worth far more to the economy now than years ago (when the incomes and wealth of those at the top were more modest relative to everyone else’s).
The distribution of income and wealth increasingly depends on who has the power to set the rules of the game.
Those at the top are raking in record income and wealth compared to everyone else because:
1. CEOs have linked their pay to the stock market through stock options. They then use corporate stock buybacks to increase stock prices and time the sale of their options to those increases.
2. They get inside information about corporate profits and losses before the rest of the public and trade on that insider information. This is especially true of hedge fund managers, who specialize in getting insider information before other investors.
3. They create or work for companies that have monopolized their markets. This enables them to charge consumers higher prices than if they had to compete for those consumers. And it lets them keep wages low, because workers have fewer options of whom to work for.
4. They use their political influence to get changes in laws, regulations, and taxes that benefit themselves and their corporations, while harming those without this kind of influence — especially smaller competitors, consumers, and workers.
5. They were born into (or married into) wealth. These days, the most important predictor of someone’s future income and wealth in America is the income and wealth of their parents. Sixty percent of all wealth is inherited. And we’re on the cusp of the biggest intergenerational transfer of wealth in history, from rich boomers to their children.
None of these reasons for the explosion of incomes and wealth at the top has anything to do with worth or merit. They have to do with power — or the power of one’s parents.
Meanwhile, the pay of average workers has stagnated because they have lost economic power and the political influence that goes with it. Corporations have kept a lid on wages by outsourcing work abroad, replacing workers with software, and preventing workers from unionizing.
Bottom line: Wealth doesn’t measure how hard someone has worked or what they deserve. It measures how well our economic system has worked for them.
Shattering deceptive mirrors: Younger generations have the chance to buck the beauty industry scam
The Millennial aunties’ letter to our Gen Z nieces: don’t let cosmetic companies win
By RAE HODGE - salon
Staff Reporter
PUBLISHED JANUARY 22, 2024 5:30AM (EST)
My darling Gen Z girls, it’s your aging Millennial aunties here. The childless, over-educated divorcees in skinny jeans who dyed your hair with Manic Panic, bought your first deck of tarot cards, drove you to the Women’s March and didn’t tell your parents about the pot (so long as you kept your grades up). We’re so proud of you. You’ve shouldered burdens far heavier than we did at your age, have keener convictions and are a hundred times funnier. We love you ferociously. And now that you’re coming of age, as 23% of the world’s population and a leading force in consumer spending, you’re finally ready to learn the most ancient Millennial art: how to brutally murder a luxury industry.
You’ve been treated with contempt by cosmetics companies. And they’ve gotten away with it too long. Enabled by insidious social media algorithms and inescapable surveillance of data-broker ad-tech, they subject you to billion-dollar psychological manipulation campaigns to keep you scrolling and buying crap you don’t need. The latest trend is “skin care” snake oil. Cosmetics companies have done little more than make so many of you starve, hide and hate yourselves. These companies deserve to die — so let’s kill them.
And who better to show you how than us? You see, the Boomers may not have realized it at the time, but when they plunged us into two Bushes and four recessions they turned Millennials into the apex predators of America’s economic ecosystem. A bit like the 40-year-old vagrant Wolverine from X-Men, we’re the eerily resilient PTSD byproducts of military-industrial experiments, filled with anger issues and toxic metals. We can’t pass down any financial tools (your Gen X grandparents got the last) but we can give you our deadliest financial weapon: The ability to break those who make you broke.
Grooming us for makeup
A 2023 LendingTree survey of 1,950 U.S. consumers found Millennials spend about $2,670 and Gen Zers about $2,048 annually on beauty products. Mostly, it’s cleansers, toners and serums. Social media influenced 67% of Millennials’ and 64% of Gen Zers’ purchases. But you ladies are sharp. About 31% of Gen Z knows online skinfluencers are full of it.
Back in 2017, one marketing whitepaper found 68% of Gen Z girls felt “appearance is a somewhat or very significant source of stress.” Now that burden is laid on Gen Alpha girls, born 2010 and later.
“With the US beauty market reaching an impressive $71.5 billion in 2022, experiencing remarkable 6.1% year-over-year growth, there is immense potential to capitalize on the current inclusivity zeitgeist,” wrote Coresight Research last year, citing a survey of 5,690 teenagers with an average age of 16.
“[We] know from some of our proprietary research, as we enter into the holiday season, that skin care is one of the categories that is at the top of their list,” Ulta Beauty’s chief merchandising officer told CNBC of Gen Alpha.
So do your aunties — and your younger sisters — one favor. Take a look at the skin care products used by tweens in the pictures of that CNBC article. Notice the shapes and colors of the packaging, how they are designed to be held and applied. Look at how the bottles and objects are visually indistinguishable from lipstick, eyeliner, mascara and foundation. It’s plain as day: the kid-targeted skin care cosmetic genre is just training wheels for makeup, disguised as being healthful rather than alluring, in order to avoid alarming your mothers and aunties.
“63% of female teenagers are in Ulta Beauty’s Ultamate Rewards Program,” wrote Coresight Research. “Teens’ core beauty wallet (cosmetics, skincare and fragrance) stands at $313 annually, a 19% year-over-year increase. This increase was driven by a 32% annual increase in spending on cosmetics, up to $123 annually … surpassing skincare spending for the first time since 2020.”
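The growth rates Coresight quotes also pin down what teens were spending a year earlier; a quick arithmetic check (assuming the quoted percentages are exact, which the report doesn’t say) backs out the implied prior-year figures:

```python
# Coresight's quoted figures, in dollars per year.
teen_wallet_now = 313   # cosmetics + skincare + fragrance combined
cosmetics_now = 123     # cosmetics alone

# Back out the prior-year figures implied by the quoted growth rates.
teen_wallet_prior = teen_wallet_now / 1.19  # 19% YoY increase -> about $263
cosmetics_prior = cosmetics_now / 1.32      # 32% YoY increase -> about $93

print(f"Implied prior year: total ${teen_wallet_prior:.0f}, cosmetics ${cosmetics_prior:.0f}")
```

In other words, teen cosmetics spending alone jumped roughly $30 in a single year.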
These companies know exactly what they’re doing. They’re grooming girls for makeup by easing them into it with “skin care” snake-oil. And it’s working. It worked on us. It worked on our parents. It worked on you. And now it’s working on your younger sisters.
A machine for self-hatred
We don’t want to preach about social media like hypocrites. But you’ve got to know what you’re up against, and we’d never ask you to stand on anything without receipts. It’s not hard to find peer-reviewed studies confirming links between social media, unhealthy body image and mental health problems in girls. They’ve spiked since COVID-19 lockdowns pushed more kids online.
In the Irish Journal of Psychological Medicine in 2020, researchers found girls’ body dissatisfaction directly related to time spent on social media. A 2022 study from the University of Delaware found teen girls’ body anxiety connected to other depressive symptoms (with a towering citation list). Studies in Obesity Reviews and Current Psychology found associations between social media exposure, mental health and teen diet in 2023. The same year, a Clinics in Dermatology study found social media can “hinder body dysmorphic disorder patient treatment, leading to excessive use of cosmetic procedures.”
Unsurprisingly, a 2023 review of 21 articles in the Journal of Psychosocial Nursing and Mental Health Services concurred, as did a 2023 review in PLOS Global Public Health.
“Evidence from 50 studies in 17 countries indicates that social media usage leads to body image concerns, eating disorders/disordered eating and poor mental health,” researchers wrote.
Finally, University of Western Australia researchers said in 2022:
“Adolescent girls appear more vulnerable to experiencing mental health difficulties from social media use than boys ... Sexual objectification through images may reinforce to adolescent girls that their value is based on their appearance.”
Are we saying abandon all social media? Not at all. We know you often have to be there. We do too. But caveat emptor, as they say — or “buyer beware” for those of you whose schools slashed Latin studies. Online platforms are Rube Goldberg machines for self-hatred. They pimp our attention spans to companies paying for ads — no matter how it harms our mental health — and then train us to perform for perpetual surveillance. Never underestimate their greed, never forget they conspire with the enemy.
Enough is enough. Makeup for fun and artistry’s sake is one thing, but we’ve lost too much money and self-esteem to digital con artists who call us ugly. Your murderous Millennial aunties are with you. Now, let’s rip this industry apart and use its blood for lipstick.
Finally, University of Western Australia researchers said in 2022:
“Adolescent girls appear more vulnerable to experiencing mental health difficulties from social media use than boys ... Sexual objectification through images may reinforce to adolescent girls that their value is based on their appearance.”
Are we saying abandon all social media? Not at all. We know you often have to be there. We do too. But caveat emptor, as they say — or “buyer beware” for those of you whose schools slashed Latin studies. Online platforms are Rube Goldberg machines for self-hatred. They pimp our attention spans to companies paying for ads — no matter how it harms our mental health — and then train us to perform for perpetual surveillance. Never underestimate their greed, never forget they conspire with the enemy.
Enough is enough. Makeup for fun and artistry’s sake is one thing, but we’ve lost too much money and self-esteem to digital con artists who call us ugly. Your murderous Millennial aunties are with you. Now, let’s rip this industry apart and use its blood for lipstick.
Bottled water contains hundreds of thousands of plastic bits: study
Agence France-Presse - raw story
January 9, 2024 6:53AM ET
Bottled water is up to a hundred times worse than previously thought when it comes to the number of tiny plastic bits it contains, a new study in the Proceedings of the National Academy of Sciences said Monday.
Using a recently invented technique, scientists counted on average 240,000 detectable fragments of plastic per liter of water in popular brands -- between 10-100 times higher than prior estimates -- raising potential health concerns that require further study.
"If people are concerned about nanoplastics in bottled water, it's reasonable to consider alternatives like tap water," Beizhan Yan, an associate research professor of geochemistry at Columbia University and a co-author of the paper told AFP.
But he added: "We do not advise against drinking bottled water when necessary, as the risk of dehydration can outweigh the potential impacts of nanoplastics exposure."
There has been rising global attention in recent years on microplastics, which break off from bigger sources of plastic and are now found everywhere from the polar ice caps to mountain peaks, rippling through ecosystems and finding their way into drinking water and food.
While microplastics are anything under 5 millimeters, nanoplastics are defined as particles below 1 micrometer, or a millionth of a meter -- so small they can pass through the digestive system and lungs, entering the bloodstream directly and from there to organs, including the brain and heart. They can also cross the placenta into the bodies of unborn babies.
There is limited research on their impacts on ecosystems and human health, though some early lab studies have linked them to toxic effects, including reproductive abnormalities and gastric issues.
To study nanoparticles in bottled water, the team used a technique called Stimulated Raman Scattering (SRS) microscopy, which was recently invented by one of the paper's co-authors, and works by probing samples with two lasers tuned to make specific molecules resonate, revealing what they are to a computer algorithm.
They tested three leading brands but chose not to name them, "because we believe all bottled water contain nanoplastics, so singling out three popular brands could be considered unfair," said Yan.
The results showed between 110,000 and 370,000 particles per liter, 90 percent of which were nanoplastics while the rest were microplastics.
The most common type was nylon -- which probably comes from plastic filters used to purify the water -- followed by polyethylene terephthalate, or PET, which is what the bottles themselves are made from, and leaches out when the bottle is squeezed. Other types of plastic enter the water when the cap is opened and closed.
Next, the team hopes to probe tap water, which has also been found to contain microplastics, though at far lower levels.
related: Bottled Water Contains 240,000 Plastic Particles per Liter, Study Finds
Medicare Advantage Plans: The Hidden Dangers and Threats to Patient Care
Medicare Advantage Plans Prioritize Profits at the Expense of Patient Well-being
JOE MANISCALCO - dc report
12/11/2023
The Medicare open enrollment period — that special time of year when the purveyors of profit-driven Medicare Advantage health insurance plans target retirees for the hard sell — ended this week. And for the first time ever, more than half of all Medicare-eligible recipients nationwide have swallowed the bait and signed up. And what exactly is so wrong with that?
Well, not a thing, opponents charge. Unless, of course, you get sick and prefer not to have some corporate bean counter determine whether or not you can get the test or begin the treatment your doctor prescribes.
It could also be a big problem if you happen to like your doctor and the specialist they recommend you see — but later learn they are not “in network,” or that they have decided to drop Medicare Advantage altogether because they simply cannot tolerate all the built-in frustration and bureaucratic hassle.
Private health insurance companies, in fact, are great at touting the lower out-of-pocket costs and SilverSneaker perks associated with profit-driven Medicare Advantage plans. But, they are loath to talk about the narrowed pool of available physicians Medicare Advantage recipients can actually see, and the prior authorizations that are erected between Medicare Advantage recipients and the care they need — all in the never-ending pursuit of profit.
Across the country, an increasing number of physicians are reporting the prior authorizations that come with profit-driven Medicare Advantage plans have, indeed, become a “nightmare” for both themselves and the patients they are attempting to treat.
Just last month, the 125-year-old American Hospital Association (AHA) loudly decried and sought redress from the “inappropriate denials of medically necessary care” that come with Medicare Advantage plans.
More frighteningly still, Physicians for a National Health Program (PNHP) point to data showing Medicare Advantage patients are being forced to wait longer for cancer surgery than those using authentic or “traditional” Medicare.
“And on top of having to wait longer for cancer surgery than patients with traditional Medicare,” Dr. Claudia Fegan, chief medical officer at Cook County Health in Illinois says, “cancer patients in Medicare Advantage plans are far less likely to use a National Cancer Institute–designated Center of Excellence.”
That means cancer patients enrolled in profit-driven Medicare Advantage plans are more likely to have their surgeries done at hospitals far less experienced in performing complex oncological procedures.
“It turns out,” Dr. Fegan says, “these delays in getting treatment and the redirection to the less experienced hospitals, is literally killing patients.”
Astonishingly, it gets worse, Medicare Advantage foes insist.
Medicare Advantage purveyors including UnitedHealthcare and Cigna are currently facing class action lawsuits for allegedly using AI (Artificial Intelligence) to further deny Medicare Advantage recipients vital, necessary care — again, all in the name of profit.
The ongoing drive to funnel Medicare eligible retirees into privatized health insurance plans has sparked a nationwide revolt, however.
Municipal retirees from coast-to-coast who were promised traditional Medicare benefits when they initially signed up for civil service are leading the fight against privatization and refusing to allow Medicare-eligible recipients to be herded into profit-driven Medicare Advantage plans.
And they are winning.
This past summer, a New York State Supreme Court Justice permanently blocked the City of New York, and embattled Mayor Eric Adams from stripping municipal retirees of their traditional Medicare benefits and pushing them into a profit-driven Medicare Advantage plan administered by insurance giant Aetna.
Public service retirees in Delaware have also successfully blocked efforts in their state to push them into a privatized Medicare Advantage plan.
A stay order blocking Governor John Carney’s administration from herding public service retirees into Medicare Advantage was issued in Delaware Superior Court on October 19, 2022, and remains in effect despite ongoing efforts to have it overturned.
Municipal retirees from New York, Delaware, Seattle and other places around the country, in fact, are in the process of organizing into a potent united front against the Medicare Advantage push they say not only harms recipients, but threatens the survival of traditional Medicare itself.
Medicare Advantage foes were celebrated as national heroes during a “Save Medicare” rally held in front of the U.S. Capitol steps on July 27.
In September, U.S. Representatives Ritchie Torres (NY-15) and Nicole Malliotakis (NY-11) introduced bipartisan legislation aimed at stopping employers nationwide from pushing retirees into profit-driven Medicare Advantage programs.
In addition to threatening the health of elderly recipients and undermining traditional Medicare, opponents of privatization have long maintained that organized labor’s sloppy embrace of Medicare Advantage will ultimately prove disastrous for an already beleaguered labor movement.
Why would anyone want to join a union, retired civil servants argue, if after a lifetime of paying dues and advancing union solidarity, union leadership can simply “sell off everything that they have” in retirement?
“That is a travesty, and that will cause the end of the Labor Movement,” Marianne Pizzitola, president of the New York City Organization of Public Service Retirees said in June.
Veteran single payer advocate Kay Tillow insists union leaders scrambling to find a solution to soaring healthcare costs by acquiescing to privatization and the Medicare Advantage push are making a huge mistake.
“They’ve chosen [a solution] that takes us backward and will hurt their future retirees as well — and it’s going to destroy Medicare in the meantime,” Tillow said.
Billionaire Philanthropy Is a Scam
A new study describes in grotesque detail the extent to which the ultrarich have perverted the charitable giving industry.
Jason Linkins
the new republic
November 18, 2023/3:00 a.m. ET
Money gets a bad rap in some quarters. It’s said that it “isn’t everything,” that it cannot “buy you happiness,” that loving it is “the root of all evil.” But if there’s one thing that money is absolutely stupendous at doing, it’s solving problems. Naturally, the more money you have, the more problems you can solve. Which is why the fact that we’ve allowed a large portion of an otherwise finite amount of wealth to become concentrated in the hands of an increasing number of billionaire plutocrats is something of a crisis: Since they have all the money, they call the shots on what problems get solved. And the main problem they want to solve is the public relations problem that’s arisen from their terrible ideas.
Naturally, the ultrarich put on a big show of generosity to temper your resolve to claw back their fortunes. Everywhere you look, their philanthropic endeavors thrive: They’re underwriting new academic buildings at the local university, providing the means by which your midsize city can have an orchestra, and furnishing the local hospital with state of the art equipment. And a sizable number of these deep-pocketed providers have banded together to create “The Giving Pledge,” a promise to give away half of their wealth during their lifetimes. It all sounds so pretty! But as a new report from the Institute for Policy Studies finds, these pledgers aren’t following through on their commitments—and the often self-serving nature of their philanthropy is actually making things worse for charitable organizations.
As the IPS notes, the business of being a billionaire—which suffered nary a hiccup during the pandemic—is booming. So one of the challenges that the Giving Pledgers face is that the rate at which they accrue wealth is making their promise harder to fulfill. The 73 pledgers “who were billionaires in 2010 saw their wealth grow by 138 percent, or 224 percent when adjusted for inflation, through 2022,” with combined assets ballooning from $348 billion to $828 billion.
According to the report, those who are making the effort to give aren’t handing their ducats over to normal charities. Instead, they are increasingly putting their money into intermediaries, such as private foundations or Donor Advised Funds, or DAFs. As the IPS notes, donations to “working charities appear to be declining” as foundations and DAFs become the preferred warehouses for philanthropic funds. (As TNR reported recently, DAFs are a favorite vehicle for anonymous donors to fund hate groups—while also pocketing a nice tax break.) This also has spurred some self-serving innovations among the philanthropic class, “such as taking out loans from their foundations or paying themselves hefty trustee salaries.” More and more of the pledgers are conflating their for-profit investments with their philanthropy as well. And wherever large pools of money are allowed to accrue, outsize political influence follows.
The shell games played by billionaire philanthropists are nothing new. The most common of these is the two-step process by which the ultrarich make charitable donations to solve a problem that their for-profit work caused in the first place. It’s nice that the Massachusetts Institute of Technology’s Institute for Integrative Cancer Research exists, but it’s soured somewhat knowing that the $100 million gift David H. Koch seeded it with was born from a profitable enterprise that included the carcinogens sold by Koch subsidiary Georgia-Pacific. In similar fashion, Mark Zuckerberg’s Chan Zuckerberg Initiative “handed out over $3m in grants to aid the housing crisis in the Silicon Valley area,” a problem that, Guardian contributors Carl Rhodes and Peter Bloom note, Zuckerberg had no small part in causing in the first place.
And at the top of the plutocratic food chain, a billionaire’s charitable enterprise can become a philanthropic Death Star. This week, The Baffler’s Tim Schwab took a deep dive into the Bill and Melinda Gates Foundation and discovered that the entity essentially exists as a public relations stunt to justify Gates’s own staggering wealth. One noteworthy highlight involved Gates reaching out to his upper-crust lessers during the Covid pandemic, seeking additional money on top of the foundation’s own commitment, creating a revenue stream that could tie an ethicist into a knot. “During a global pandemic, when billions of people were having trouble with day-to-day expenses even in wealthy nations,” Schwab asks, “why would an obscenely wealthy private foundation start competing for charitable donations against food banks and emergency housing funds?”
As the IPS study notes, perhaps the worst aspect of all of this is that ordinary taxpayers essentially subsidize these endeavors: According to their report, “$73.34 billion in tax revenue was lost to the public in 2022 due to personal and corporate charitable deductions,” a number that goes up to $111 billion once you include what “little data we have about charitable bequests and the investments of charities themselves,” and balloons to several hundreds of billions of dollars each year “if we also include the capital gains revenue lost from the donation of appreciated assets.”
The IPS offers a number of ideas for reforming the world of billionaire philanthropy to better serve the public interest. There are changes to the current regime of private foundations and Donor Advised Funds that would ensure that money flows to worthy recipients with greater speed and transparency. Regulations could ensure that such organizations aren’t just another means by which billionaires shower favors on board members—and that would give foundation board members greater independence to act on their own ideas and prevent the organization from being used as one rich person’s influence-peddling machine. But for my money, the one way we could solve this problem is to institute one of the most popular policy positions in the history of the United States, and tax the rich to the hilt.
How Trump And Bush Tax Cuts For Billionaires Broke America
$10 Trillion in Added US Debt Since 2001 Shows 'Bush and Trump Tax Cuts Broke Our Modern Tax Structure'
Jon Queally — crooks & liars
October 25, 2023
The U.S. Treasury Department on Friday released new figures related to the 2023 budget that showed a troubling drop in the nation's tax revenue compared to GDP—a measure which fell to 16.5% despite a growing economy—and an annual deficit increase that essentially doubled from the previous year.
"After record U.S. government spending in 2020 and 2021" due to programs related to the economic fallout from the Covid-19 crisis, the Washington Post reports, "the deficit dropped from close to $3 trillion to close to $1 trillion in 2022. But rather than continue to fall to its pre-pandemic levels, the deficit unexpectedly jumped this year to roughly $2 trillion."
While much of the reporting on the Treasury figures painted a picture of various and overlapping dynamics to explain the surge in the deficit—including higher payments on debt due to interest rates, tax filing waivers related to extreme weather events, the impact of a student loan forgiveness program that was later rescinded, or a dip in capital gains receipts—progressive tax experts say none of those complexities should act to shield what's at the heart of a budget that brings in less than it spends: tax giveaways to the rich.
Bobby Kogan, senior director for federal budget policy at the Center for American Progress, has argued repeatedly that growing deficits in recent years have a clear and singular chief cause: Republican tax cuts that benefit mostly the wealthy and profitable corporations.
In response to the Treasury figures released Friday, Kogan said that "roughly 75%" of the surge in the deficit and the debt ratio, the amount of federal debt relative to the overall size of the economy, was due to revenue decreases resulting from GOP-approved tax cuts over recent decades. "Of the remaining 25%," he said, "more than half" was higher interest payments on the debt related to Federal Reserve policy.
"We have a revenue problem, due to tax cuts," said Kogan, pointing to the major tax laws enacted under the administrations of George W. Bush and Donald Trump. "The Bush and Trump tax cuts broke our modern tax structure. Revenue is significantly lower and no longer grows much with the economy." He also shared a visualization of the growing debt ratio.
"The point I want to make again and again and again is that, relative to the last time CBO was projecting stable debt/GDP, spending is down, not up," Kogan said in a tweet Friday night. "It's lower revenue that's 100% responsible for the change in debt projections. If you take away nothing else, leave with this point."
---
In a detailed analysis produced in March, Kogan explained that, "If not for the Bush tax cuts and their extensions—as well as the Trump tax cuts—revenues would be on track to keep pace with spending indefinitely, and the debt ratio (debt as a percentage of the economy) would be declining. Instead, these tax cuts have added $10 trillion to the debt since their enactment and are responsible for 57 percent of the increase in the debt ratio since 2001, and more than 90 percent of the increase in the debt ratio if the one-time costs of bills responding to COVID-19 and the Great Recession are excluded."
On Friday, the office of Sen. Sheldon Whitehouse (D-R.I.) cited those same numbers in a press release responding to the Treasury's new report.
"Tax giveaways for the wealthy are continuing to starve the federal government of needed revenue: those passed by former Presidents Trump and Bush have added $10 trillion to the debt and account for 57 percent of the increase in the debt-to-GDP ratio since 2001," read the statement. "If not for those tax cuts, U.S. debt would be declining as a share of the economy."
Whitehouse, who chairs the Senate Budget Committee, said the dip in federal revenue and growth in the overall deficit both have the same primary cause: GOP fealty to the wealthy individuals and powerful corporations that bankroll their campaigns.
"In their blind loyalty to their mega-donors, Republicans' fixation on giant tax cuts for billionaires has created a revenue problem that is driving up our national debt," Whitehouse said Friday night. "Even as federal spending fell over the last year relative to the size of the economy, the deficit increased because Republicans have rigged the tax code so that big corporations and the wealthy can avoid paying their fair share."
Offering a solution, Whitehouse said, "Fixing our corrupted tax code and cracking down on wealthy tax cheats would help bring down the deficit. It would also ensure teachers and firefighters don't pay higher tax rates than billionaires, level the playing field for small businesses, and promote a stronger economy for all."
None of the latest figures—those showing that tax cuts have injured revenues and therefore spiked deficits and increased debt—should be a surprise.
In 2018, shortly after the Trump tax cuts were signed into law, a Congressional Budget Office (CBO) report predicted precisely this result: that revenues would plummet, annual deficits would grow, and not even the economic growth Republicans promised to justify the giveaway would be enough to make up the difference in the budget.
"The CBO's latest report exposes the scam behind the rosy rhetoric from Republicans that their tax bill would pay for itself," Sen. Chuck Schumer (D-N.Y.), now Senate Majority Leader, said at the time.
"Republicans racked up the national debt by giving tax breaks to their billionaire buddies, and now they want everyone else to pay for them."
In its 2018 report, the CBO predicted the deficit would rise to $804 billion by the end of that fiscal year. Now, for all the empty promises and howling from the GOP and their allied deficit hawks, the economic prescription they forced through Congress has resulted in an annual deficit of more than double that, all while they demand the nation's poorest and most vulnerable pay the price through cuts to key social programs—including food aid, education budgets, unemployment benefits, and housing assistance.
Meanwhile, the GOP majority in the U.S. House—with or without a Speaker currently holding the gavel—still has plans to extend the Trump tax cuts if given half a chance. In May, a CBO analysis of that pending legislation found that such an extension would add an additional $3.5 trillion to the national debt.
"Republicans racked up the national debt by giving tax breaks to their billionaire buddies, and now they want everyone else to pay for them," Sen. Whitehouse said at the time. "It is one of life's great enigmas that Republicans can keep a straight face while they simultaneously cite the deficit to extort massive spending cuts to critical programs and support a bill that would blow up deficits to extend trillions in tax cuts for the people who need them the least."
From 1947 to 2023: Retracing the complex, tragic Israeli-Palestinian conflict
Agence France-Presse - raw story
October 11, 2023 11:31AM ET
The Israeli-Palestinian conflict reignited on October 7 after a surprise offensive launched by Hamas against Israel. In retaliation, Israel ordered air strikes and a "complete siege" of the Gaza Strip.
1947: Thousands of European Jewish emigrants, many of them Holocaust survivors, board a ship – which came to be called Exodus 1947 – bound for then British-controlled Palestine. Heading for the “promised land”, they are intercepted by British naval ships and sent back to Europe. Widely covered by the media, the incident sparks international outrage and plays a critical role in convincing the UK that a UN-brokered solution is necessary to solve the Palestine crisis.
A UN special committee proposes a partition plan allocating 56.47 percent of Palestine to a Jewish state and 43.53 percent to an Arab state. Palestinian representatives reject the plan, but their Jewish counterparts accept it.
On November 29, the UN General Assembly approves the plan, with 33 countries voting for partition, 13 voting against it and 10 abstentions.
1948-49: On May 14, David Ben-Gurion, Israel’s first prime minister, publicly reads the Proclamation of Independence. The declaration, which would go into effect the next day, comes a day ahead of the expiration of the British Mandate on Palestine. The Jewish state takes control of 77 percent of the territory of Mandate Palestine, according to the UN.
For Palestinians, this date marks the “Nakba”, the catastrophe that heralds their subsequent displacement and dispossession.
As hundreds of thousands of Palestinians, hearing word of massacres in villages such as Deir Yassin, flee towards Egypt, Lebanon, and Jordanian territory, the armies of Egypt, Syria, Lebanon, Jordan and Iraq attack Israel, launching the 1948 Arab-Israeli War.
The Arab armies are repelled, a ceasefire is declared and new borders – more favorable to Israel – are drawn. Jordan takes control of the West Bank and East Jerusalem while Egypt controls the Gaza Strip.
1956: The Second Arab-Israeli War, or the Suez Crisis, takes place after Egypt nationalizes the Suez Canal. In response, Israel, the United Kingdom and France form an alliance, and Israel occupies the Gaza Strip and the Sinai Peninsula. The Israeli army eventually withdraws its troops under pressure from the US and the USSR.
1959: Yasser Arafat sets up the Palestinian organization Fatah in Gaza and Kuwait. It later becomes the main component of the Palestine Liberation Organisation (PLO).
1964: The PLO is created.
1967: The Third Arab-Israeli War, or the Six-Day War, between Israel and its Arab neighbors, results in a dramatic redrawing of the Middle East map. Israel seizes the West Bank and East Jerusalem, the Gaza Strip, the Sinai Peninsula and the Golan Heights.
1973: On October 6, during the Jewish holiday of Yom Kippur, Egyptian and Syrian armies launch offensives against Israel, marking the start of a new regional war. The Yom Kippur War, which ends 19 days later with Israel repelling the Arab armies, results in heavy casualties on all sides – at least several thousand deaths.
1979: An Israeli-Egyptian peace agreement is sealed in Washington following the Camp David Accords signed in 1978 by Egyptian President Anwar Sadat and Israeli Prime Minister Menachem Begin. According to the terms of this agreement, Egypt regains the Sinai Peninsula, which it had lost after the Six-Day War. Sadat becomes the first Arab leader to recognise the State of Israel.
1982: Under Defense Minister Ariel Sharon, Israeli troops storm into neighbouring Lebanon in a controversial military mission called “Operation Peace of Galilee”. The aim of the operation is to wipe out Palestinian guerrilla bases in southern Lebanon. But Israeli troops push all the way to the Lebanese capital of Beirut.
The subsequent routing of the PLO under Arafat leaves the Palestinian refugee camps in Lebanon essentially defenceless. From September 16 to 18, Lebanese Christian Phalangist militiamen – with ties to Israel – enter the camps of Sabra and Shatila in Beirut, unleashing a brutal massacre that shocks the international community. The massacres, the subject of an Israeli inquiry known as the Kahan Commission, would subsequently cost Sharon his job as defence minister.
1987: Uprisings in Palestinian refugee camps in Gaza spread to the West Bank, marking the start of the First Palestinian Intifada ("uprising" in Arabic). Nicknamed the "war of stones", the First Intifada lasts until 1993, costing more than 1,000 Palestinian lives. The image of stone-throwing Palestinian demonstrators pitched against Israel’s military might comes to symbolise the Palestinian struggle.
It is also during this uprising that Hamas, influenced by the ideology of Egypt's Muslim Brotherhood, is born. From the outset, the Islamist movement favours armed struggle and rejects outright any legitimacy of an Israeli state.
1993: After months of frenetic secret negotiations, Yasser Arafat and Israeli Prime Minister Yitzhak Rabin sign the Oslo Accords. The accords see the creation of the Palestinian Authority, which gets administrative control of the West Bank and Gaza. On September 13 on the White House lawn, Arafat and Rabin exchange a historic handshake in the presence of US President Bill Clinton. The event is watched by over 400 million TV viewers across the world.
1995: On November 4, Rabin is assassinated by a Jewish right-wing extremist at a peace rally in Tel Aviv.
1996: Benjamin Netanyahu is elected prime minister for the first time.
2000: On September 28, Sharon provokes Palestinians by making a tour of Jerusalem’s Al-Aqsa/Temple Mount site as leader of the right-wing Likud party, sparking the Second Intifada, also known as the Al-Aqsa Intifada. It lasts until 2005, with some 3,000 Palestinians and 1,000 Israelis killed over five years.
2001: Sharon is elected prime minister of Israel and breaks off contact with Arafat, who is subsequently confined to his compound in Ramallah.
2002: The Israeli government launches Operation Defensive Shield, a major military incursion into the West Bank, and begins construction of a wall to separate Israel from the West Bank. The UN Security Council speaks for the first time of a coexistence between the two states of Israel and Palestine. The Israeli army lifts the siege on Ramallah.
2004: On March 22, Sheikh Ahmed Yassin, the paraplegic co-founder and spiritual leader of Hamas, is killed in an Israeli helicopter strike. Eight months later, on November 11, PLO chairman Arafat dies at a Paris hospital following a prolonged illness. Arafat's death has been the subject of controversy. Some experts believe he died of natural causes, while others are open to the possibility he was poisoned using polonium 210.
2005: Mahmoud Abbas is elected president of the Palestinian Authority. After a 38-year occupation, Israel pulls out of Gaza.
2006: On January 4, Prime Minister Sharon suffers a stroke and falls into a coma that he stays in until his death in 2014. Ehud Olmert takes over as prime minister and head of Sharon’s newly founded centrist party, Kadima.
Hamas sweeps the legislative elections in the Palestinian Territories, causing the US and EU to freeze direct aid to the Palestinian government.
Lebanese Islamic fundamentalist group Hezbollah launches rocket attacks on Israel and takes two Israeli soldiers captive. Israel retaliates with force and many civilians, mainly Lebanese, are killed. The war, widely viewed as a failure in Israel, leads to mounting calls for Olmert to resign.
2007: Following months of internecine military fighting between Hamas and Fatah forces, Hamas seizes control of Gaza.
2008: On December 27, the Israel Defense Forces (IDF) launch a surprise offensive on Gaza, killing more than 200 people in one day. Shortly after, the IDF follows up with a two-week-long ground invasion of Gaza. A UN report concluded that both Israel and Hamas committed war crimes during the conflict.
2009: On January 18, Israel and Hamas declare unilateral ceasefires, ending the 22-day battle which killed more than 1,300 Palestinians as well as 13 Israelis.
2011: On March 27, Israel deploys an anti-rocket missile defense system called Iron Dome, which allows the country to intercept short-range rockets regularly fired from Gaza.
2012: Israeli forces kill top Hamas commander Ahmed al-Jaabari in an air strike on November 14 and follow with more strikes over an eight-day campaign during which Hamas retaliates by firing rockets at Jerusalem for the first time. More than 130 Palestinians as well as five Israelis are killed.
2014: In June, three Israeli teenagers are abducted and murdered near the West Bank city of Hebron. Israeli authorities blame Hamas for the incident and on July 8, launch multiple air strikes on Gaza, prompting an exchange of rocket fire with Hamas over a seven-week period. The Israeli missile strikes result in the deaths of more than 2,200 Palestinians in Gaza.
2018: On March 30, tens of thousands of Palestinians rally near the Israeli border in the Gaza Strip to protest Israel’s blockade of the enclave. Demonstrations continue for several months. At least 189 Palestinians were killed and more than 6,000 injured during these protests between the end of March and the end of December 2018, according to the Independent International Commission of Inquiry mandated by the UN Human Rights Council.
2021: Palestinian worshippers clash with Israeli police in May at Jerusalem's Al-Aqsa Mosque compound following weeks of mounting tension. Hamas unleashes a barrage of rockets into Israel after demanding Israeli forces withdraw from the compound. Israel responds with air strikes on Gaza, setting off an 11-day conflict resulting in the deaths of more than 200 people.
2022: Israel pounds Gaza with air strikes on August 5, killing a senior militant of the Islamic Jihad group and triggering retaliatory rocket fire from the Palestinian enclave. At least 40 Palestinians are killed in the three days of fighting that follow.
2023: Israeli forces kill nine Palestinian Islamic Jihad gunmen and civilians on January 26 in a raid on a flashpoint town in the occupied West Bank. Palestinian militants hit back by firing two rockets, triggering retaliation from Israel. No further casualties are reported.
On October 7, Hamas mounts an unprecedented, multipronged surprise attack on Israel with fighters infiltrating the heavily fortified Gaza border in several locations by air, land and sea. Israeli forces respond with air strikes on Gaza and military reinforcements to the border.
On October 7, Hamas mounts an unprecedented, multipronged surprise attack on Israel with fighters infiltrating the heavily fortified Gaza border in several locations by air, land and sea. Israeli forces respond with air strikes on Gaza and military reinforcements to the border.
vote gop, die early!!!
Red State Conservatives Are Dying Thanks to the People They Vote For
by Joan McCarter | the smirking chimp
October 9, 2023 - 7:04am
Republicans aren’t just at war with each other in the U.S. House of Representatives: They’ve been quietly at war with their own constituents for decades. The Washington Post has a lengthy case study of what this has meant for one state taken over by Republicans: Ohio. One study the Post cites estimates that roughly “1 in 5 Ohioans will die before they turn 65…. a similar life expectancy to residents of Slovakia and Ecuador, relatively poor countries.”
The Post looks at Ashtabula County, Ohio, and compares it with its neighbor across the Pennsylvania border, Erie County, and the next county over, Chautauqua in New York state. The three counties have all experienced the same economic woes over the past several decades, as industrial jobs disappeared and wages fell. “But Ashtabula residents are much more likely to die young, especially from smoking, diabetes-related complications or motor vehicle accidents, than people living in its sister counties in Pennsylvania and New York, states that have adopted more stringent public health measures,” the Post found.
The primary difference: Democratic versus Republican lawmakers and leaders. Democratic states have enacted legislation to protect public health—including measures like seat-belt laws, high tobacco taxes, and more generous Medicaid and safety net benefits. Ohio and other Republican states have not. The Post cites a study by Ellen Meara, a health economics and policy professor at the Harvard T.H. Chan School of Public Health, in which she looked at geographic disparities in premature mortality over recent decades. It’s not just geography, she told the Post. It’s politics.
Meara’s paper doesn’t explicitly say that. Instead, she and her fellow authors say that “health disparities across states may arise from long-run changes in state policies or health ‘investments,’” including things like “anti-smoking policies, expansions of Medicaid, income support, and norms around health behaviors.” In other words, the kinds of investments blue states—like California, Pennsylvania, and New York—have made.
Three decades ago, California and Ohio had comparable health outcomes, ranking in the middle of all the states. Since then, the more proactive and progressive California has seen its premature-death rate fall significantly. Ohio has not. “By 2017, California had the nation’s second-lowest mortality rates, falling behind only Minnesota; Ohio ranked 41st,” the Post’s analysis found.
While Ohio is the specific case study for the Post, they found the divide has increased nationwide.
Today, people in the South and Midwest, regions largely controlled by Republican state legislators, have increasingly higher chances of dying prematurely compared with those in the more Democratic Northeast and West, according to The Post’s analysis of death rates.
Those disparities are bound to increase over the coming years. Some of these studies are still looking at pre-COVID-19 statistics. As Charles Gaba has been chronicling for the past few years, the death rate from COVID-19 is higher in Republican-leaning areas. Republicans have made the COVID-19 pandemic a political fight, like in Florida where the actual person in charge of public health calls the vaccine “anti-human” and is urging Floridians to avoid the newest vaccines.
Studies are soon also going to have to account for states that have banned abortion and criminalized reproductive health care. A study last year from the Commonwealth Fund determined that “maternal death rates were 62 percent higher in 2020 in abortion-restriction states than in abortion-access states (28.8 vs. 17.8 per 100,000 births).”
One of the factors behind that is on full display in parts of Idaho, where there are no practicing OB-GYN physicians any more—they’ve left the state over fear of prosecution. That makes giving birth a lot more dangerous. Making it even worse, the state is now going to prosecute emergency room doctors who provide abortions to stabilize a patient’s health.
Gun safety legislation—or the absence of it—is another key difference contributing to higher premature-death rates in red states. In 2021, the states with the lowest rate of gun deaths were Hawaii, Massachusetts, New Jersey, Rhode Island, and New York. Mississippi, Louisiana, Wyoming, Missouri, and Alabama had the highest gun-death rates.
In all of these states, the so-called “party of life” has consistently proven that what it’s really about is actively enabling premature death. They’ve proven that by refusing to save lives by expanding Medicaid, by warring against basic science, by keeping people hungry and vulnerable, and by criminalizing doctors. All in the name of so-called “family values,” “freedom,” and the “sanctity of life.”
How the GOP Suckered America on Tax Cuts
“Wait a minute!” I can hear you saying. “Cutting taxes on rich people makes them richer, but cutting taxes on working class people cuts their pay? WTF?!?”
THOM HARTMANN
SEP 4, 2023
It’s labor day, so let’s look at the radically different ways income tax cuts or increases affect working class people versus the morbidly rich.
Over the past 40 years, Republicans have pulled off an incredible magic trick. They’ve convinced average working people that tax cuts benefit them when in fact the opposite is true.
It all boils down to two simple principles, which are — unfortunately — a mystery to most Americans and ignored in both our political and media discussions of income taxes.
1. Income tax cuts for the morbidly rich raise a nation’s debt but do nothing else. Reagan’s BS “trickle down” claims notwithstanding, tax cuts for the rich don’t even stimulate economic growth: they just fatten billionaires’ money bins and offshore accounts. And because tax cuts on the rich are paid for by increasing the national debt, they’re a drag on the economy. They make rich people richer, but make the nation poorer.
2. Cutting income taxes on working-class people, however, actually cuts their base pay over the long run. And, paradoxically, when income taxes on working people go up, as they did in the 1930s through the 1960s, it generally leads to pay increases! This shocking and counterintuitive reality is something no politician since FDR has had the courage to explain.
“Wait a minute!” I can hear you saying. “Cutting taxes on rich people makes them richer, but cutting taxes on working class people cuts their pay? WTF?!?”
Here’s how it works with a short story thrown in.
Some years ago I did my radio program for a week from the studios of Danish Radio in Copenhagen.
Speaking with one of the more conservative members of Parliament, I asked why the Danish people didn't revolt over an average 52% income tax rate on working people, with an even higher rate on top earners.
He pointed out to me that the average Dane was doing just fine, with a minimum wage that averaged about $18 an hour, free college and free healthcare, not to mention four weeks of paid vacation every year and a reputation as the happiest nation on earth, according to a study done by the University of Leicester in the United Kingdom.
"You Americans are such suckers," he told me, as I reported at the time. "You think the rules for taxes that apply to rich people also apply to working people, but they don't.
“When working people’s taxes go up,” he said, “their pay also goes up over time. When their taxes go down, their pay goes down. It may take a year or two or three to all even out, but it always works that way — look at any country in Europe. And that rule on taxes is the exact opposite of how it works for rich people!”
Economist David Ricardo explained this in 1817 with his “Iron Law of Wages,” laid out in his book On the Principles of Political Economy and Taxation.
Ricardo pointed out that in the working class “labor marketplace,” before-tax income is pretty much irrelevant. The money people live on, the money that defines the “marketplace for labor,” is take-home pay.
After-tax income.
But the rules for how taxes work are completely different for rich people.
When taxes go down on rich people, they simply keep the money that they saved with the tax cut. They use it to stuff larger wads of cash into their money bins.
When taxes go up on them, they’ll just raise their own pay — until they hit a confiscatory tax rate (which hasn’t existed since the Reagan Revolution), and then they’ll stop giving themselves raises and leave the money in their company.
And, history shows, while keeping that money in their company to avoid a high top tax bracket, employers typically pay their workers more over time as well.
In other words, as taxes go up, income typically goes up for working class people but goes down for the very rich: High tax brackets discourage exploitation by the very rich and push up wages for working class people.
We saw this throughout the 1940–1980 period; income at the very tip-top was stable at about 30 times workers' wages because rich people didn't want to get pushed into that top tax bracket of 74%.
But for working class people, Ricardo pointed out 200 years ago, the rules are completely different.
When working class people end up with more after-tax money as a consequence of a tax cut, their employers realize that they're receiving more than the "market for labor" would require.
And over time the “Iron Law” dictates that employers will cut back those wages when working class people get a tax cut.
For example, if the average worker on an automobile assembly line made $30,000 a year in take-home pay, all the car manufacturing companies know that $30K in their pockets is what people will build cars for. It’s the set-point in the “market for labor” for that industry or type of job.
Because of income taxes, both federal, state and local, an auto worker may need a gross, pre-tax income of $40,000 a year to end up with that $30,000 take-home pay, so that $40,000 gross (before-tax) income becomes the average pay across the industry. At that pay and tax rate, workers end up taking home $30,000 a year.
But what happens if that income tax for working-class people is cut in half?
Now, a $40,000 a year autoworker’s salary produces $35,000 a year in take-home pay, and employers in the auto industry know that that’s $5000 a year more than they have to pay to hire new people to build cars.
Put another way, the employers know that they can hire people in the labor market for $30,000 a year take-home pay, which is now a gross salary of $35,000, so they begin lowering their $40,000 gross wage offerings toward $35,000 to make up for the tax cut and to keep take-home pay within the $30,000 "market for wages."
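The arithmetic in that example can be sketched as a toy calculation. Assuming a single flat effective tax rate (a simplification; real federal, state, and local schedules are progressive), the gross wage an employer must offer to deliver a target take-home pay is gross = net / (1 − rate):

```python
# Toy sketch of the take-home-pay arithmetic above. Assumes one flat
# effective tax rate -- a simplification of real progressive schedules.

def gross_for_take_home(target_net: float, tax_rate: float) -> float:
    """Gross wage needed so that after-tax pay equals target_net."""
    return target_net / (1 - tax_rate)

TARGET_NET = 30_000  # the "market for labor" set-point in the example

# At a 25% effective rate, a $40,000 gross wage nets $30,000:
print(gross_for_take_home(TARGET_NET, 0.25))   # 40000.0

# Cut the rate in half (12.5%): the old $40,000 gross now nets $35,000,
# so employers can drift gross offers down toward ~$34,286 (the article
# rounds this to $35,000) and still hit the $30,000 take-home set-point:
print(gross_for_take_home(TARGET_NET, 0.125))  # ~34285.71
print(40_000 * (1 - 0.125))                    # 35000.0
```

The numbers here are the article's own illustrative figures, not real wage or tax data.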
Since Reagan's massive tax cut, we've seen this very phenomenon in the auto industry itself: as taxes went down, pay for new hires has been more than cut in half.
In other words, income tax cuts don’t increase the take-home pay of working people who have little control over their salaries. It’s the opposite, in fact.
On the other hand, when income taxes on working people increase, employers have to raise working class wages so their workers’ take-home pay stays the same. And that’s exactly what happened in the period from the 1940s to the 1980s as tax rates were fairly high across the board.
But when income taxes on working people go down, employers will reduce the wages they offer over time to keep their workers’ take-home pay at the same level. That, after all, is what Ricardo’s “market for labor” specifies.
But the rules are completely different for the rich, who live outside the “Iron Law of Labor.”
When taxes change for the very rich, they take home less money when taxes go up and keep more money when taxes go down. It’s the opposite of what happens to working-class people.
The incredible magic trick that the morbidly rich have done in America over the past 40 years is to convince average working people that the tax rules for the rich also apply to working class people, and therefore tax cuts benefit average workers, too.
Economists have known since the early 1800s that this is nonsense, as David Ricardo and many others have pointed out.
But after decades of this “you should worry about tax increases the same way rich people do” message being pounded into our brains by Republican politicians, working people think that tax cuts benefit them and tax increases hurt them.
It’s a real testimonial to the power of the Republican propaganda machine that even though individual wages have been flat or even declining in many industries for the past 40 years because of Republican tax cuts, the average American still thinks tax cuts are a good thing for them.
In fact, the time of greatest prosperity for the working class, when working class take-home pay (and wealth) was increasing faster than the income (and wealth) of the top 1%, was the period from 1940 to 1980 when taxes were high and the nation was prosperous.
FDR raised the top tax bracket to 91% and it stayed there through his administration, as well as those of Truman, Eisenhower, JFK and the early years of LBJ. President Johnson dropped it to 74%, which held through his administration as well as those of Nixon, Ford and Carter.
This high-tax period was the time of maximum American working class prosperity.
Reagan’s massive tax cuts in the 1980s put an end to that and started the explosion of wealth at the top which has led America to produce over 700 billionaires today. And gutted America’s ability to maintain first-class infrastructure.
To put it simply: tax cuts and tax increases on working class people are essentially irrelevant to their take-home pay. Tax cuts only help the morbidly rich, while tax increases reduce the national debt and help fund infrastructure and other programs that benefit working class people.
Over time, Ricardo’s “market for labor” will always normalize wages, regardless of tax rates on working-class people. But that rule does not apply to rich people because they can simply change their own income in response to tax policy: they are the only winners from tax cuts.
To stabilize our economy and re-empower working people, we must bring back the top tax bracket that existed before the Reagan Revolution. It’ll also provide the necessary funds to rebuild our country from the wreckage of Reagan’s neoliberal policies, which are largely still in place.
By taxing income in the very top brackets at a rate well above 50%, ideally the 74% rate we had before Reagan, we stabilize the economy, stop the relentless poaching of working peoples’ wages for the money bins of the rich, and begin restoring our middle class.
And raising taxes on the morbidly rich also funds government programs that support the middle class: things like healthcare, free public education, and anti-poverty programs.
It’s time to re-normalize taxes on the morbidly rich (and leave them where they are on working class people) so we can again have a growing economy and a prosperous middle class.
'Misleading': Alarm raised about Medicare Advantage 'scam'
Brett Wilkins, Common Dreams - raw story
July 26, 2023, 5:15 AM ET
Democratic U.S. lawmakers on Tuesday joined senior citizens, people with disabilities, and healthcare campaigners at a Capitol Hill press conference to kick off a week of action demanding Congress move to stop abuses by so-called Medicare Advantage programs peddled by profiteering insurance companies and "reclaim Medicare."
"We are here to raise the alarm about Medicare Advantage. We are here to protect our Medicare," Sen. Elizabeth Warren (D-Mass.) said to robust applause.
"This year, for the very first time, more than half of all beneficiaries are enrolled in Medicare Advantage instead of traditional Medicare," she continued. "But Medicare Advantage substitutes private insurance companies for traditional Medicare coverage, and that private coverage is failing both Medicare beneficiaries and taxpayers."
"It's all about the money," Warren said. "Private insurers are in Medicare Advantage to play games to extract more money from the government."
"Experts estimate that Medicare Advantage insurers will receive more than $75 billion in overpayments this year alone, and that's the real punch to the gut," she continued. "Not only do Medicare Advantage insurers rip off the government, they routinely deny care to patients who need it."
"Seniors and people with disabilities who rely on Medicare deserve better," the senator affirmed. "We can strengthen traditional Medicare, and by doing that, we can save money and we can use some of those savings to expand benefits, like hearing, dental, and vision... and add an out-of-pocket cap for all beneficiaries... and lower the eligibility age for Medicare."
"Medicare money should be spent to deliver services for people," Warren added, "not to boost profits for insurance."
Rep. Rosa DeLauro (D-Conn.) said that "it is time to call out so-called Medicare Advantage for what it is. It's private insurance that profits by denying coverage and using the name of Medicare to trick our seniors."
Universal healthcare activist Ady Barkan, who founded the advocacy group Be a Hero—an event sponsor—asserted that "healthcare is a human right, and Medicare should be a rock-solid guarantee to that fundamental right."
"It should be a beacon of mutual responsibility and solidarity in the wake of 50 years of neoliberal ideology—a pillar of love, standing tall in a world too often dominated by greed," he added. "Health insurance corporations are doing everything they can to destroy this vision. That's why Be A Hero is leading this week of action to fight back."
Like the lawmakers, Alex Lawson, executive director of the advocacy group Social Security Works, blasted "bad actors in Medicare Advantage" who he said "are delaying and denying the care seniors and people with disabilities need."
"Corporate insurance is designed to generate profits by delaying and denying care, harming and killing patients instead of providing care," Lawson added.
Among the patients who spoke at Tuesday's event was Jen Coffey, who described the barriers she's faced while seeking lifesaving care for complex regional pain syndrome caused by breast cancer under a UnitedHealthcare Medicare Advantage plan.
"I'm tired of talking to insurance employees who get to override the care and medications my doctors order while having not one shred of medical knowledge to make that decision with," she said. "I want medical freedom where my care decisions are made by my providers and me, not a representative reading out of a manual or a computer algorithm."
Wendell Potter, who heads the Center for Health and Democracy, repeated the common refrain that "so-called Medicare Advantage is neither Medicare nor an advantage. It is simply another scheme by the insurance companies to line their pockets at the expense of consumers by denying and delaying care."
"The healthcare market is confusing for consumers and the misleading branding of calling private insurance Medicare only makes this worse," Potter stressed.
Zenei Triunfo-Cortez, a registered nurse and president of the National Nurses United union, implored Congress "to take immediate action to prevent delays and denials of care in Medicare Advantage" and "support improvements to traditional Medicare and the expansion of traditional Medicare to cover every person living in the United States."
Earlier this year, Democratic U.S. Reps Mark Pocan (Wis.), Jan Schakowsky (Ill.), and Ro Khanna (Calif.) reintroduced legislation that would prohibit insurance companies from using the word "Medicare" in their health plans.
Progressive lawmakers have also criticized President Joe Biden for delaying promised curbs on Medicare Advantage plans amid heavy insurance industry lobbying.
excerpt: Why Are We Letting the Red State Welfare Oligarchs Mooch Off Blue States?
Red states are mooching off the Blue states, using that essentially stolen tax money to reinvent the old Confederacy, “own the libs,” and wage “war on woke...”
THOM HARTMANN
JUN 27, 2023
America is rapidly bifurcating — becoming two nations — and one of the main drivers of the process is a federal system that encourages Red states to mooch off Blue states, using essentially stolen tax money to reinvent the old Confederacy, “own the libs,” and wage “war on woke.”
Most Red states have become oligarchic white supremacist medieval-like fiefdoms with obscene levels of often multigenerational wealth at the top, extreme poverty at the bottom, and working people, women, and minorities kept in subordinate roles through explicit government and corporate policy.
In this, these Red states are following the once-classic European and later Southern US tradition of a patriarchal, hierarchical society run by male kings, nobles, plantation masters, and wealthy churchmen, with all the work done by serfs, slaves, women, or impoverished wage-slaves.
Frederick Douglass, who was born into Southern slavery, described the South as “a little nation by itself, having its own language, its own rules, regulations, and customs.”
Fewer than 2000 families — six-tenths of one percent of the Southern population — owned more than 50 enslaved people and ruled the oligarchy that we call the Confederacy with an iron fist. The 75 percent of white people in the South during that era who did not own any enslaved persons generally lived in deep poverty.
---
Today’s version of yesteryear’s plantation owners are called CEOs, hedge and vulture fund managers, and the morbidly rich.
They use the power of political bribery given them by five corrupt Republicans on the Supreme Court — with Clarence Thomas’ tie-breaking Citizens United vote on behalf of his sugar daddy Harlan Crow — to lord over their Red states, regardless of the will of those states’ citizens.
Working people in Red states are kept in poverty by “Right to Work for Less” laws based on the GOP’s 1947 Taft-Hartley Act (passed over Harry Truman’s veto), deprived of healthcare by their governors’ refusal to expand Medicaid to cover all low-wage working people, and draconian cuts along with byzantine bureaucratic obstacles to getting state aid ranging from housing support to food stamps to subsidized daycare to unemployment insurance.
Women are returning to chattel status in Red states, the property of their fathers or husbands, their bondage enforced by bounty systems and the threat of imprisonment should they try to assert agency over their own bodies.
Multiple Red states have outlawed teaching the actual history of slavery, Jim Crow, and the ongoing violence and persecution suffered by racial, gender, and religious minorities. Many are using government tax revenues to subsidize typically-all-white “Christian” religious schools, a trend spreading across Republican-controlled states like an out-of-control fungus.
Almost a dozen GOP-controlled states have passed laws criminalizing protests against Republican policies, and, writes The Atlantic quoting the International Center for Not-for-Profit Law, “several of those states have simultaneously provided civil or criminal protection for drivers who hit [and kill] protesters…”
---
Robert E. Lee and Jefferson Davis no longer rule the South on behalf of the notorious “2000 families” but their spiritual heirs, today’s Red state governors, would make them proud with their restrictions on voting, encouragement of armed white supremacist vigilantism, and state tax subsidies for American oligarch billionaires and their corporate empires.
Georgia Governor Brian Kemp — whose family, investigative reporter Greg Palast discovered, was the first to bring enslaved Africans to Georgia — famously signed a law codifying 10-hour voting lines in Black neighborhoods and giving his state the power to throw people off voting rolls without justification in a closed-door ceremony before a painting of one of the 2000 families’ Georgia slave plantations.
It’s said that Red states are trying to take us back to the 1950s. In fact, most are shooting for the 1870s, when women were the property of their husbands, poor children worked 12-hour days for pennies, queer people were invisible, and Black, Asian, Native American, and Hispanic people could be murdered with impunity.
---
In this, Blue states are following the vision and values laid out by several of America’s Founders who helped create the first nation in world history espousing the idea that nobody should have to live in poverty and an important role of government was to prevent people from falling through the cracks.
Thomas Paine, for example, laid out a proposal for what we today call Social Security in his pamphlet Agrarian Justice, and Thomas Jefferson created the nation’s first totally free university (the University of Virginia). Half the delegates to the Constitutional Convention spoke against slavery as a curse against our nation’s ideals, although it took a bloody war to end that “peculiar institution” held so tightly in the grip of the 2000 families.
Even in the first years of our republic, the founding generation understood that caring for the public good was an essential function of government, as referenced in the preamble to the Constitution. President George Washington signed the first legislation providing federal funds for poorhouses that included food, clothing, shelter, and medical care along with job training.
Three decades later, when the legislation was up for renewal, President James Madison (the “Father of the Constitution”) vetoed a provision proposed by Southern states that would have cycled those revenues through local churches, writing in his veto message that federal funding of churches — even for charitable antipoverty purposes the government supported — “would be a precedent for giving to religious societies as such a legal agency in carrying into effect a public and civil duty.”
So, how did we get here and what can we do about it?
Ironically, it turns out that Red states are able to carry out their campaign of reviving the old Confederacy and its values of patriarchy, hierarchy, and wage slavery in large part because Blue states are subsidizing them.
While Donald Trump carried 2,497 counties in the 2020 election compared to Joe Biden’s 477 counties, the Blue counties Biden carried generate a total of 71 percent of the nation’s economic activity.
The fiscal health of Blue states and counties — along with widespread Red state poverty exacerbated by Red State tax breaks for the morbidly rich — accounts for the bizarre reality that Blue states are subsidizing the brutal and retrograde policies of Red states.
A March 2023 report from the Rockefeller Institute lays out how glaring this appropriation of Blue state revenue by the Red states is. For New York alone, for example, they write that over a five-year period:
“New York taxpayers have given $142.6 billion more to the federal government than New York residents have received back in federal spending.”
As the Associated Press summarized in an article titled AP FACT CHECK: Blue high-tax states fund red low-tax states:
“Mississippi received $2.13 for every tax dollar the state sent to Washington in 2015, according to the Rockefeller study. West Virginia received $2.07, Kentucky got $1.90 and South Carolina got $1.71.
“Meanwhile, New Jersey received 74 cents in federal spending for every tax dollar the state sent to Washington. New York received 81 cents, Connecticut received 82 cents and Massachusetts received 83 cents.
“California fared a bit better than other blue states. It received 96 cents for every dollar the state sent to Washington.”
---
Federal policy makes no effort to balance the revenue flowing to Red states against the tax money those states send to Washington, DC. That imbalance in US tax law drives the bizarre process whereby citizens in Blue states are forced by law to pay for all-white “Christian” academies, enforcement of abortion restrictions, persecution of asylum seekers and immigrants, and political attacks on queer people.
Red state “welfare queen” governors and legislatures are going to happily continue this grift as long as we let them get away with it. Congress should pass legislation mandating that Red state revenues to DC must at least match 90 percent of the money they get back; 100 percent would be better, and help hugely with our nation’s budget deficit.
It’s time to end Red state welfare!
America's "systemic racism" isn't just domestic: Consider who dies around the world in our wars
Nearly all the people killed in U.S. wars since 9/11 have been people of color — and we almost don't notice
By NORMAN SOLOMON - salon
Contributing Writer
PUBLISHED JUNE 27, 2023 5:30AM (EDT)
A recent Justice Department report concluded that "systemic" racial bias in the Minneapolis Police Department "made what happened to George Floyd possible." During the three years since a white police officer brutally murdered Floyd, nationwide discussions of systemic racism have extended well beyond focusing on law enforcement to also assess a range of other government functions. But such scrutiny comes to a halt at the water's edge — stopping short of probing whether racism has been a factor in U.S. military interventions overseas.
Hidden in plain sight is the fact that virtually all the people killed by U.S. firepower in the "war on terror" for more than two decades have been people of color. This notable fact goes unnoticed in a country where — in sharp contrast — racial aspects of domestic policies and outcomes are ongoing topics of public discourse.
Certainly, the U.S. does not attack a country because people of color live there. But when people of color live there, it is politically easier for U.S. leaders to subject them to warfare — because of institutional racism and often-unconscious prejudices that are common in the United States.
Racial inequities and injustice are painfully apparent in domestic contexts, from police and courts to legislative bodies, financial systems and economic structures. A nation so profoundly affected by individual and structural racism at home is apt to be affected by such racism in its approach to war.
Many Americans recognize that racism holds significant sway over their society and many of its institutions. Yet the extensive political debates and media coverage devoted to U.S. foreign policy and military affairs rarely even mention — let alone explore the implications of — the reality that the several hundred thousand civilians killed in America's "war on terror" have been almost entirely people of color.
The flip side of biases that facilitate public acceptance of making war on nonwhite people came to the fore when Russia invaded Ukraine in early 2022. News coverage included reporting that the war's victims "have blue eyes and blond hair" and "look like us," Los Angeles Times television critic Lorraine Ali noted. "Writers who'd previously addressed conflicts in the Gulf region, often with a focus on geopolitical strategy and employing moral abstractions, appeared to be empathizing for the first time with the plight of civilians."
Such empathy, all too often, is skewed by the race and ethnicity of those being killed. The Arab and Middle Eastern Journalists Association has deplored "the pervasive mentality in Western journalism of normalizing tragedy in parts of the world such as the Middle East, Africa, South Asia and Latin America. It dehumanizes and renders their experience with war as somehow normal and expected."
Persisting today is a modern version of what W.E.B. Du Bois called, 120 years ago, "the problem of the color line — the relation of the darker to the lighter races." Twenty-first century lineups of global power and geopolitical agendas have propelled the United States into seemingly endless warfare in countries where few white people live.
Racial, cultural and religious differences have made it far too easy for most Americans to think of the victims of U.S. war efforts in Iraq, Afghanistan, Syria, Libya and elsewhere as "the other." Their suffering is much more likely to be viewed as merely regrettable or inconsequential rather than heart-rending or unacceptable. What Du Bois called "the problem of the color line" keeps empathy to a minimum.
"The history of U.S. wars in Asia, the Middle East, Africa and Latin America has exuded a stench of white supremacy, discounting the value of lives at the other end of U.S. bullets, bombs and missiles," I concluded in my new book "War Made Invisible." "Yet racial factors in war-making decisions get very little mention in U.S. media and virtually none in the political world of officials in Washington."
At the same time, on the surface, Washington's foreign policy can seem to be a model of interracial connection. Like presidents before him, Joe Biden has reached out to foreign leaders of different races, religions and cultures — as when he fist-bumped Saudi Arabia's de facto ruler, Crown Prince Mohammed bin Salman, at their summit a year ago, while discarding professed human rights concerns in the process.
Overall, in America's political and media realms, the people of color who've suffered from U.S. warfare abroad have been relegated to a kind of psychological apartheid — separate, unequal, and implicitly not of much importance. And so, when the Pentagon's forces kill them, systemic racism makes it less likely that Americans will actually care.
The Debt Limit Is Just One of America’s Six Worst Traditions
Believe it or not, the debt ceiling is an improvement on what the United States used to do.
Jon Schwarz, The Intercept
May 20, 2023, 6:00 a.m.
IMAGINE THAT YOUR family has a generations-long tradition that requires that for every 10th dinner, you search your neighbors’ trash cans like raccoons and eat whatever garbage you find.
Usually none of you asks why you do this. It’s just what you learned from your parents. But occasionally someone does some family research and finds out it originated in the early 1800s, when your great-great-great-great-great-great grandfather explained in his diary that he was creating this custom “so every feventh child will expire from famonella.” And you have to admit this still works, since every now and then one of your children dies from food-borne illness. Yet you keep on eating the garbage.
This is what American politics is like, except we have dozens of these aged traditions that are actively malevolent or simply serve no purpose at all. They nonetheless cling like barnacles to our life in the 21st century. We just can’t get our act together to get rid of them.
The debt ceiling plaguing Washington politics — and, potentially, poisoning the rest of us — is just one of at least six of these abominable ideas.
The Debt Limit
Believe it or not, the debt ceiling is an improvement on what the United States used to do. Congress once required the executive branch to get its permission to do any borrowing whatsoever and, in fact, often specified all the details — i.e., how long the bonds would take to mature, what interest rate they would pay, etc.
This was a terrible way to run a country, and to its credit, Congress over the decades after World War I changed this awful system into another, slightly less awful one. Now Congress just limits the total borrowing by the government and lets the Treasury Department take care of the details.
But it still makes no sense. Congress has already ordered the executive branch to spend money on certain activities and also levy certain taxes. It’s contradictory and silly for Congress to also say that the government can only borrow a certain amount of money to make up whatever difference between the spending and taxes it itself has required.
It’s also dangerous. No one knows exactly what will happen if the debt limit is breached, and the Biden administration then fails to use the various options it has to keep paying the bills. But it definitely would be extremely unpleasant.
In the past, this danger has never manifested in reality, for good reason. A debt limit imbroglio would immediately cause the most pain to the financial and corporate interests traditionally represented by the Republican Party. As some people have observed, the GOP’s refusal to raise the debt limit unconditionally is like a crazed man pointing a gun at his head and saying, “Give me what I want, or I’ll shoot!”
But there are two problems with this metaphor. First, a strong faction of the Republican Party appears to have convinced itself that shooting itself in the head wouldn’t hurt that much. Second, the rest of the country is the GOP’s metaphorical conjoined twin. If that faction decides to commit suicide, it’s going to cause severe problems for us too.
Pretty much the only other country that has created this pointless problem for itself is Denmark. I lived there briefly when I was 6 years old, and while they broadcast American shows on TV, they didn’t have ads to accompany them and just filled up the extra time with footage of goldfish swimming around in a bowl. Keeping the debt limit will inevitably lead us down the path to this kind of horrifying socialism.
The Electoral College
The U.S. right constantly proclaims that the Electoral College is a sign of the enduring wisdom of our founders, who created it to give smaller rural states a voice in the choice of the president.
This means that they must also believe the Founding Fathers were dolts with absolutely no idea what they were doing. Of the first five presidents, four were from Virginia, and all four served two terms. Meanwhile, the only exception, John Adams from Massachusetts, was in office for just four years. This means that during the first 36 years of the presidency, the chief executive was a Virginian for 32 of them. And during this period, Virginia was either America’s biggest or second-biggest state.
However, America’s founders were not in fact incredibly incompetent. The actual rationale for the Electoral College was explained by James Madison in 1787 at the Constitutional Convention. Madison said he believed the best way to choose a president would be by popular vote, which “would be as likely as any that could be devised to produce an Executive Magistrate of distinguished Character.”
But “there was one difficulty however of a serious nature attending an immediate choice by the people.” This was, Madison said, the fact that Southern states generally had stricter limits on which men could vote, and more of their population was enslaved. This meant that the South “could have no influence in [a popular] election” and so would never support a Constitution that used this method. Hence the Electoral College kludge was necessary to get the U.S. off the ground.
The Senate
Madison, however, was by no means all-in on democracy. As he also said at the Constitutional Convention, he believed that for the new country to endure, part of the government had to represent the “invaluable interests” of large, rich landowners and make sure the rabble couldn’t vote to take their wealth away. Part of the structure they were creating in Philadelphia, Madison believed, “ought to be so constituted as to protect the minority of the opulent against the majority. The senate, therefore, ought to be this body.”
The Constitution originally ordained that senators would be elected by state legislatures. This was altered by the 17th Amendment, and senators have been popularly elected since 1913. Nonetheless, Madison’s scheme continues to work surprisingly well, with the Senate still being the place where the political hopes of Americans go to die.
One solution here would be for the California legislature to wait until Democrats control the federal House and Senate. Then California could separate itself into 68 heavily gerrymandered blue states with Wyoming-sized populations. Congress could admit all the new states and their 136 Democratic senators into the union — and then easily block any red states from trying something similar. This would be totally constitutional and be worth it just to get the U.S. right to stop talking about the wisdom of the founders.
The Filibuster
The Senate is inherently against popular democracy. But those running it have long believed that it isn’t anti-democracy enough and so have supported the supermajority requirements of the filibuster. Between 1917 and 1994, 30 bills were stopped from passage thanks to the filibuster. Half of these were civil rights measures, including anti-lynching bills, the Civil Rights Act of 1957, and attempts to outlaw poll taxes. This is why in 2020, Barack Obama called the filibuster a “Jim Crow relic.” But neither he nor any prominent Democrat has put much energy into getting rid of it.
“First Past the Post” Voting
The way voting generally works in the U.S. is that whoever gets the most votes wins. This is simple, easy to understand, and bad. It naturally creates a two-party duopoly, since each party can accurately harangue any miscreants within its ranks tempted to vote for a third party, warning that they will simply act as spoilers — i.e., if they vote for their first-choice candidate, they’re merely making it more likely that their last-choice candidate will win.
There are several excellent solutions to this problem, including instant-runoff voting and — for House elections on a state and federal level — multimember districts. The problem here is that both the Democratic and Republican parties love the current setup and are not interested in creating more competition for themselves just because it would be good for America.
Most political commentators don’t have the courage to tell you this, but I do: All of our current suffering is the fault of the Florida Panhandle.
The Florida Panhandle
Geographically and culturally, the Florida Panhandle makes no sense. On any sensible map, it would belong to Alabama. But it’s part of Florida thanks to ancient colonial struggles between the United Kingdom, Spain, and France — struggles that happened before there even was a United States.
If Florida didn’t have its conservative panhandle, Al Gore would have easily beaten George W. Bush in Florida in the 2000 election and become president. The Bush administration resolutely ignored all the warnings from its intelligence agencies about the coming 9/11 attacks, but Gore almost certainly would have taken the threat seriously enough to disrupt the terrorist plot. No 9/11, no Iraq War. And no Bush presidency, no majority on the Supreme Court for Citizens United and the ensuing catastrophic surge of cash into the U.S. political system. Moreover, the 2007-2008 economic disaster would probably not have occurred or would have been significantly less severe.
Instead the Florida Panhandle gave us our current country, which is constantly going haywire. It also gave us Errol Morris’s documentary “Vernon, Florida,” originally titled “Nub City,” about a small town where many residents have amputated their own limbs in order to collect dismemberment insurance.
SO THAT’S THAT: six ghastly political ideas that do nothing but torment us. We’re currently experiencing this with the debt ceiling and may soon feel it to a far greater degree. Yet we don’t have it in us to get rid of any of them. It’s enough to make you think the most powerful force in human society isn’t the normal candidates like money or sex, but inertia.
Refined carbs and red meat driving global rise in type 2 diabetes, study says
By Sandee LaMotte, CNN
Updated 12:02 PM EDT, Mon April 17, 2023
Gobbling up too many refined wheat and rice products, along with eating too few whole grains, is fueling the growth of new cases of type 2 diabetes worldwide, according to a new study that models data through 2018.
“Our study suggests poor carbohydrate quality is a leading driver of diet-attributable type 2 diabetes globally,” says senior author Dr. Dariush Mozaffarian, a professor of nutrition at Tufts University and professor of medicine at Tufts School of Medicine in Boston, in a statement.
Another key factor: People are eating far too much red and processed meats, such as bacon, sausage, salami and the like, the study said. Those three factors — eating too few whole grains and too many processed grains and meats — were the primary drivers of over 14 million new cases of type 2 diabetes in 2018, according to the study, which was published Monday in the journal Nature Medicine.
In fact, the study estimated 7 out of 10 cases of type 2 diabetes worldwide in 2018 were linked to poor food choices.
“These new findings reveal critical areas for national and global focus to improve nutrition and reduce devastating burdens of diabetes,” said Mozaffarian, who is also the editor in chief of the Tufts Health & Nutrition Letter.
Too many processed foods
Mozaffarian and his team developed a research model of dietary intake between 1990 and 2018 and applied it to 184 countries. Compared with 1990, there were 8.6 million more cases of type 2 diabetes due to poor diet in 2018, the study found.
Researchers found that eating too many unhealthy foods was more of a driver of type 2 diabetes on a global level than a lack of wholesome foods, especially for men compared with women, for younger compared with older adults, and for urban versus rural residents.
Over 60% of the total global diet-attributable cases of the disease were due to excess intake of just six harmful dietary habits: eating too much refined rice, wheat and potatoes; too many processed and unprocessed red meats; and drinking too many sugar-sweetened beverages and fruit juice.
Inadequate intake of five protective dietary factors — fruits, nonstarchy vegetables, nuts and seeds, whole grains and yogurt — was responsible for just over 39% of the new cases.
People in Poland and Russia, where diets tend to focus on potatoes and red and processed meat, and other countries in Eastern and Central Europe as well as Central Asia, had the highest percentage of new type 2 diabetes cases linked to diet.
Colombia, Mexico and other countries in Latin America and the Caribbean also had high numbers of new cases, which researchers said could be due to a reliance on sugary drinks and processed meat, as well as a low intake of whole grains.
“Our modeling approach does not prove causation, and our findings should be considered as estimates of risk,” the authors wrote.
A new study links 45 health problems to "free sugar." Here's what that means, and how to avoid it
Free sugar has been linked to cancer, heart attacks, diabetes, obesity, strokes and more ailments
By Matthew Rozsa, Staff Writer, Salon
Published April 7, 2023, 1:56 p.m. EDT
"Sugar is bad for you" is an old health axiom, but the depths to which sugar can harm one's body have perhaps not yet been fully tabulated. Indeed, according to a new study in the prestigious medical journal BMJ, sugar consumption is linked to 45 different ailments. Yes, you read that right: forty-five different health problems all exacerbated by or correlated with eating that sweet white powder.
From obesity and type 2 diabetes to seven types of cancer and 18 endocrine/metabolic outcomes, the range of conditions tied to sugar is striking. Sugar has also been found to have addictive qualities, so much so that it is common for people to binge on it.
But not all sugars are created alike, and the bad stuff is something that's known as "free sugar." According to Dr. James DiNicolantonio, an associate editor of BMJ Open Heart and a cardiovascular research scientist and doctor of pharmacy at Saint Luke's Mid America Heart Institute in Kansas City, this refers to any sugar that does not come from a whole food (a food that has been processed and refined as little as possible). This is in contrast to sugars from foods that have long been part of our diet and that typically occur naturally — say, in fruits like apples or vegetables like carrots.
As such, the white crystalline sugar you put in coffee or the high fructose corn syrup in your soda and fast food has free or added sugars. A delicious and untampered orange or banana, which humans have long been accustomed to eat, does not have them.
This distinction, however seemingly slight, makes a world of difference when it comes to your health. After reviewing 73 meta-analyses drawn from 8,601 unique scientific articles about added sugar, the BMJ study found significant links with 45 different adverse health outcomes. These include asthma, cancer, depression, type 2 diabetes, gout, heart attacks, hypertension, obesity, strokes and tooth decay.
As DiNicolantonio explained to Salon, added sugars are linked to a wide range of health issues because they appear in three-fourths of packaged foods, including soft drinks and fruit juices, and comprise anywhere from one-fourth to two-fifths of the total caloric intake of children and roughly one-seventh of the total caloric intake of adults. This "overconsumption drives type 2 diabetes, fatty liver, obesity, kidney disease and cardiovascular disease," DiNicolantonio told Salon.
Most people who consume over 30 to 40 grams of added sugar consistently will "increase their risk for numerous health issues," DiNicolantonio concluded. "For those who are more active (i.e. athletes) they can get away with eating more sugar, but ideally most of their sugar intake should come from whole food."
So why is sugar so versatile in its ability to damage the body? Part of it is because sugars are not typically consumed alone, in the form of cubes or powder; that means studies into sugar consumption can't completely isolate the substance from the other things they are often mixed with. In other words, we are not really talking about just one substance. Hence, any study about the impact of added sugars on human health is effectively discussing all of the common unhealthy foods that usually join those added sugars.
“If you think about it, added sugar really isn't a single substance,” Dr. Alexandra DiFeliceantonio of the Fralin Biomedical Research Institute at Virginia Tech Carilion told Salon by email. DiFeliceantonio was not involved in the study. “No one is eating just tablespoons of sugar. They are most likely eating that added sugar in highly processed or ultra-processed foods. Those foods may contain other additives, high-levels of fat, or other substances that are linked to poor health outcomes. So, it's not that this one substance, sugar, is causing all these problems, but that this substance is present in a whole host of foods that are contributing to these health issues.”
DiFeliceantonio clarified that sugar could be causing some health problems on its own, but noted that it is "more likely a combination of factors."
Dr. Nicole Avena, an assistant professor of neuroscience at Mount Sinai Medical School and a visiting professor of health psychology at Princeton University who studies human health but was also not involved in the study, offered advice on how people who are health-conscious can protect themselves from added sugars.
"I think it's really a good idea to do a food diary," Avena suggested. "Do one or two days where you eat like you typically would and write down every single thing you eat — down to what condiments you're using — and you can really get a clear picture then of how much sugar you're actually consuming. And a lot of people are shocked when they do this because they think they're eating a relatively healthy diet. But when you start to break it down and look at the salad dressings, look at the condiments, even things like nuts that people think of as a healthy snack, but it often has added sugar in there."
DiNicolantonio also urged consumers to consider healthy substitutes for their favorite sweets.
"The best way to beat a sugar craving is to find healthy alternatives that provide a little natural sugar — like berries, an apple, or even a little raw honey or maple syrup," DiNicolantonio opined.
Sugar has been linked to a sweeping range of harms, from obesity and type 2 diabetes to seven types of cancer and 18 endocrine/metabolic outcomes. It has also been found to have addictive qualities, so much so that it is common for people to binge on it.
But not all sugars are created alike, and the bad stuff is something that's known as "free sugar." According to Dr. James DiNicolantonio, the Associate Editor of the British Medical Journal's (BMJ) Open Heart and a cardiovascular research scientist and doctor of pharmacy at Saint Luke's Mid America Heart Institute in Kansas City, this refers to any sugar that does not come from a whole food (or a food that has been processed and refined as little as possible). This is in contrast to sugars that have long been part of the human diet and that typically occur naturally in foods, say, in fruits like apples or vegetables like carrots.
As such, the white crystalline sugar you put in coffee or the high fructose corn syrup in your soda and fast food contains free or added sugars. A delicious, untampered-with orange or banana, which humans have long been accustomed to eating, does not.
This distinction, however seemingly slight, makes a world of difference when it comes to your health. After reviewing 73 meta-analyses drawn from 8,601 unique scientific articles about added sugar, the BMJ study found significant links with 45 different adverse health outcomes. These include asthma, cancer, depression, type 2 diabetes, gout, heart attacks, hypertension, obesity, strokes and tooth decay.
As DiNicolantonio explained to Salon, added sugars are linked to a wide range of health issues because they appear in three-fourths of packaged foods, including soft drinks and fruit juices, and comprise anywhere from one-fourth to two-fifths of the total caloric intake of children and roughly one-seventh of the total caloric intake of adults. This "overconsumption drives type 2 diabetes, fatty liver, obesity, kidney disease and cardiovascular disease," DiNicolantonio told Salon.
Most people who consume over 30 to 40 grams of added sugar consistently will "increase their risk for numerous health issues," DiNicolantonio concluded. "For those who are more active (i.e. athletes) they can get away with eating more sugar, but ideally most of their sugar intake should come from whole food."
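To put the 30 to 40 gram threshold in perspective, a short calculation converts it into calories and into a share of daily intake. The 4 kcal per gram figure is the standard energy value for sugar, and the 2,000 kcal reference diet is a common adult benchmark; neither number comes from the article itself.

```python
# Convert grams of added sugar into calories and into a share of a reference diet.
KCAL_PER_GRAM_SUGAR = 4      # standard energy value for carbohydrate
REFERENCE_DIET_KCAL = 2000   # common adult reference intake

def sugar_share(grams, diet_kcal=REFERENCE_DIET_KCAL):
    """Return (calories from the sugar, fraction of the reference diet)."""
    kcal = grams * KCAL_PER_GRAM_SUGAR
    return kcal, kcal / diet_kcal

# The 30-40 g threshold cited by DiNicolantonio:
for grams in (30, 40):
    kcal, share = sugar_share(grams)
    print(f"{grams} g added sugar = {kcal} kcal ({share:.0%} of a 2,000 kcal diet)")
```

Forty grams works out to 160 kcal, or 8 percent of a 2,000 kcal diet, which makes clear how far above the threshold the article's "one-fourth to two-fifths" of children's caloric intake really is. For scale, a single 12-ounce can of regular soda typically carries roughly that much added sugar on its own.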
So why is sugar so versatile in its ability to damage the body? Part of it is because sugars are not typically consumed alone, in the form of cubes or powder; that means studies into sugar consumption can't completely isolate the substance from the other things they are often mixed with. In other words, we are not really talking about just one substance. Hence, any study about the impact of added sugars on human health is effectively discussing all of the common unhealthy foods that usually join those added sugars.
"If you think about it, added sugar really isn't a single substance," Dr. Alexandra DiFeliceantonio from Fralin Biomedical Research Institute at Virginia Tech Carillon told Salon by email. DiFeliceantonio was not involved in the study. "No one is eating just tablespoons of sugar. They are most likely eating that added sugar in highly processed or ultra-processed foods. Those foods may contain other additives, high-levels of fat, or other substances that are linked to poor health outcomes. So, it's not that this one substance, sugar, is causing all these problems, but that this substance is present in a whole host of foods that are contributing to these health issues."
DiFeliceantonio clarified that sugar could be causing some health problems on its own, but noted that it is "more likely a combination of factors."
Dr. Nicole Avena, an assistant professor of neuroscience at Mount Sinai Medical School and a visiting professor of health psychology at Princeton University who studies human health but was also not involved in the study, offered advice on how people who are health-conscious can protect themselves from added sugars.
"I think it's really a good idea to do a food diary," Avena suggested. "Do one or two days where you eat like you typically would and write down every single thing you eat — down to what condiments you're using — and you can really get a clear picture then of how much sugar you're actually consuming. And a lot of people are shocked when they do this because they think they're eating a relatively healthy diet. But when you start to break it down and look at the salad dressings, look at the condiments, even things like nuts that people think of as a healthy snack, but it often has added sugar in there."
DiNicolantonio also urged consumers to consider healthy substitutes for their favorite sweets.
"The best way to beat a sugar craving is to find healthy alternatives that provide a little natural sugar — like berries, an apple, or even a little raw honey or maple syrup," DiNicolantonio opined.
Republican Policies Are Killing Americans: Study
"Changing all policy domains in all states to a fully liberal orientation might have saved 171,030 lives in 2019," researchers estimate, "while changing them to a fully conservative orientation might have cost 217,635 lives."
KENNY STANCIL - COMMON DREAMS
October 26, 2022
The Republican Party's regressive policies are not just unpopular; a new study out Wednesday suggests they are also deadly to those who live under them.
Working-age mortality rates have been rising for decades across the United States, but premature deaths are more pronounced in states where "conservative" policies predominate and less common in states that have adopted more "liberal" policies, according to peer-reviewed research published in PLOS ONE.
Policies that "expand state power for economic regulation and redistribution, protect the rights of marginalized groups, or restrict state power to punish deviant behavior" were defined by the study's authors as "liberal," while those with opposite aims were deemed "conservative."
For eight policy domains—criminal justice, marijuana, environment, gun safety, health and welfare, private labor, economic taxes, and tobacco taxes—the authors scored state-level measures enacted from 1999 to 2019 on a 0-to-1 continuum, with zero representing the maximum conservative score and one the maximum liberal score.
Using annual data from the National Vital Statistics System, the authors calculated state-level age-adjusted mortality rates during the same time period for deaths from all causes and from cardiovascular disease (CVD), alcohol-induced causes, suicide, and drug poisoning among adults aged 25 to 64.
When they merged the data on working-age mortality with data on state policy contexts, the authors found that liberal policies were associated with fewer early deaths among 25- to 64-year-olds between 1999 and 2019.
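The scoring scheme described above can be sketched in a few lines of Python. Everything below, including the per-state scores and the idea of averaging the eight domains into a single composite index, is invented for illustration; only the domain names and the 0-to-1 convention come from the article, and the study itself may combine domains differently.

```python
# Illustrative sketch of the study's 0-to-1 policy scoring:
# 0 = maximally conservative, 1 = maximally liberal, per domain.
DOMAINS = ["criminal justice", "marijuana", "environment", "gun safety",
           "health and welfare", "private labor", "economic taxes", "tobacco taxes"]

def composite_score(domain_scores):
    """Average the per-domain 0-1 scores into one state-level index."""
    missing = set(DOMAINS) - set(domain_scores)
    if missing:
        raise ValueError(f"missing domains: {missing}")
    return sum(domain_scores[d] for d in DOMAINS) / len(DOMAINS)

# Hypothetical state: liberal on most domains, conservative on taxes.
state_a = {d: 0.8 for d in DOMAINS}
state_a["economic taxes"] = 0.2
state_a["tobacco taxes"] = 0.3

print(f"composite policy score: {composite_score(state_a):.3f}")
```

A composite like this is what would then be merged against the state's age-adjusted mortality rates to look for the associations the authors report.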
"Changing all policy domains in all states to a fully liberal orientation might have saved 171,030 lives in 2019," the researchers estimate, "while changing them to a fully conservative orientation might have cost 217,635 lives."
Study co-author Dr. Steven Woolf, director emeritus of the Center on Society and Health at Virginia Commonwealth University, told USA Today: "As an academic who does scientific research, I studiously avoided talking about politics in my professional work... But the data are pointing us to that as a determinant of health."
Even after controlling for state-specific environmental conditions and demographic characteristics, the authors found that states that invested more in public education and economic security had lower working-age mortality rates than states that gutted workers' rights, environmental regulations, and access to healthcare, including abortion.
"If a state policymaker were to say to me, 'it's unfair to criticize my state because I have a low-educated, low-income population,' I would ask them, 'why do you have a low-educated, low-income population?'" lead study author Jennifer Karas Montez, a professor of sociology at Syracuse University, told USA Today. "It's because of your policy environment."
Demonstrating how state policy contexts influence individual behaviors associated with premature deaths, researchers observed "especially strong associations... between certain domains and specific causes of death: between the gun safety domain and suicide mortality among men, between the labor domain and alcohol-induced mortality, and between both the economic tax and tobacco tax domains and CVD mortality."
Darrell Gaskin, a health economist at Johns Hopkins Bloomberg School of Public Health, said that some people "like to think about (working-age mortality) as failures of individuals, that they eat too much or use drugs, but that's all in context."
"If we don't have the proper regulations in place to protect people, then what happens is that they could be exploited," said Gaskin. "We always get the promise from conservative states that we're going to cut your taxes and regulation and make the environment better for business, and it comes with a cost."
With the midterm elections less than two weeks away, experts say it's important for working-age Americans to know whether they are voting for officials who support right-wing policies that increase the risk of early death or those who favor humane interventions that can help people lead longer and healthier lives.
As Woolf put it, the conservative policies associated with higher working-age mortality revolve around "helping the private sector to thrive in hopes that the economic gains would trickle down to those who need more assistance," while the liberal policies associated with lower working-age mortality focus on improving economic fairness and social and environmental well-being.
With their efforts to impose anti-union "right-to-work" laws, ban abortions, and curtail Medicaid, and their insistence on ignoring gun violence and the life-threatening climate crisis, Republicans have firmly established themselves in the camp that is actively increasing premature deaths among the nation's working-age population.
Although there is a wide range of positions among Democrats that stretch from more progressive to less so, lawmakers in the party are overall much more likely than their GOP counterparts to support life-affirming public goods and services of the sort detailed in the study.
The analysis predates the ongoing Covid-19 pandemic, which has killed Americans at a significantly higher rate than people in other wealthy countries.
While the nation's deadly for-profit healthcare model, lack of paid sick leave, and other federal policies associated with 40 years of bipartisan neoliberalism have received much blame from progressives, studies show that state-level Republicans' lackadaisical public health measures and the GOP's anti-vaccination propaganda have also exacerbated suffering during the pandemic.
The Corporate Narrative on Inflation Is Bogus
Chuck Idelson | THE SMIRKING CHIMP
July 28, 2022
Republicans believe their laser focus on inflation in the midterms will override voter anger over the attempted coup, four years of Trumpism, multiple regressive Supreme Court rulings especially on reproductive rights, school shootings and other mass shootings, and GOP state attacks on voting, public education, and LGBTQ+ rights. But only if we, and the Democratic Party leadership, let them get away with it.
To challenge the GOP and corporate media narrative on inflation, start by exposing the underreported causes, identifying those exploiting rising prices, describing why the conservative fix of jacking up interest rates will cause far worse pain for most working people and families, and outlining progressive alternatives.
Conventional reporting on the current inflation mostly blames the pandemic, pointing to the 2021 stimulus package and to barely explained supply chain issues, plus the Russian invasion of Ukraine, which led to oil cutoffs, another factor in higher gas prices. However, gas prices have been dropping for a month, not that Fox News or Republicans running for office will admit that fact.
Supply chain bottlenecks are largely related to an overreliance on a global network aggravated by pandemic factory shutdowns, notably in tech products like semiconductors. Fewer acknowledge what progressive economist Robert Kuttner calls "the 40-year folly" of excessive outsourcing for cheaper labor overseas, or the neoliberal deregulation of the links in the chain, from the unloading of goods at ports to rail and trucking transport, that exacerbated Covid disruptions.
The right wing, of course, blames Biden and Democrats for the March 2021 pandemic stimulus package that provided U.S. families critical financial assistance following pandemic income losses, as well as workers and unions demanding wage increases.
The anti-recession stimulus was vital to millions of families, especially those who are low-income, single women and workers of color. Moreover, inflation was hardly confined to the United States, as John Oliver noted this week: Britain and Germany have seen their highest inflation rates in four decades, and Eastern Europe, Japan, and other countries are also experiencing high inflation.
To dig deeper, always look closely at who controls the U.S. economy: Wall Street and big corporations generally, through their profiteering practices and their enormous influence over policymakers in Congress and state governments.
Corporate profits, Los Angeles Times columnist Michael Hiltzik points out, "have played a much larger role in fueling inflation than wage increases or the currently low unemployment rate. Wages have crept higher over the last year, but the increases have trailed inflation, which is why so many workers and their families are feeling the sting of higher prices. Corporate profit margins, however, have rocketed into the stratosphere, outpacing the inflation rate and pulling it higher."
Over the last full fiscal year, 53.9 percent of what corporations are charging for their products has gone to profits, while only 7.9 percent went to unit labor costs, according to Economic Policy Institute research.
How did the profits get so high? By corporations exploiting the inflationary cycle to jack up prices to massively swell their profit margins to fatten executive compensation packages and shareholder returns.
While the overall 12-month inflation rate hit 9 percent in July, prices for meats, poultry, fish and eggs rose 12 percent, electricity rose 14 percent, airline fares soared 34 percent, and piped gas services exploded by 38 percent.
Overall, "markups and profits skyrocketed in 2021 to their highest recorded level since the 1950s … the fastest annual pace since 1955," wrote Mike Konczal and Niko Lusiani in a June research article for the liberal Roosevelt Institute.
"Here's an inflation remedy you're not hearing much about: Reduce corporate profits," Hiltzik wrote in late June.
That's not the remedy demanded by Wall Street and a long list of corporate and right-wing economists, pundits and think tanks, and those typically in charge of administrative economic policies and federal agencies like the Federal Reserve. Their solution to inflation, based on decades of conservative political dogma, is always to punish workers and families by lowering their income: higher interest rates are intended to prod companies to slash wages and jobs so workers and their families have far less to spend, even on the most basic needs, such as food, housing, and healthcare. That's what they mean by "reducing consumer demand."
During an inflation spike in the late 1970s, then-Federal Reserve Chair Paul Volcker infamously said, "the standard of living of the average American has to decline." Or as conservative economist and later Fed Chair Alan Greenspan intoned in 1997, the key to economic success is "greater worker insecurity."
Reversing the decades of rigid neoliberal policies would not only prioritize the vast majority of American people over corporate wealth, it would also be good politics...
Under grilling from Sen. Elizabeth Warren (D-Mass.) in testimony to the Senate Banking Committee earlier this month, current Fed Chair Jerome Powell, who has already raised interest rates once and is about to sharply raise them again, admitted the interest rate hikes would not bring down gas or food prices. What the rate increases will do, Warren responded, is "make it more likely that companies will fire people and slash hours to shrink wage costs." As Hiltzik frames it, "sacrificing jobs in the cause of reducing inflation is a cure worse than the disease."
Rather than following the punitive script favored by Wall Street and their many acolytes in the media and punditry, there are other ways to attack rising prices and protect people harmed by them. Those include targeting corporate profits, not worker wages, by, for example, a tax on excess profits as both the United Kingdom and Spain have done.
Kuttner suggests Biden "use the government's antitrust authority to go after big companies that are using their market power to impose unjustified price hikes." The government, he adds, could specifically target the ocean shipping industry which, he notes, "enjoys antitrust immunity that it has used to jack up shipping rates and profits adding to price inflation and the entire range of imports."
Additional steps could include implementing rental assistance to keep people in their homes, restoring the child tax credit that was critical to reducing child poverty in the pandemic stimulus program, and enacting the long overdue bill to let Medicare negotiate drug prices that is apparently finally advancing in Congress. Or even enacting price controls as President Franklin D. Roosevelt did during World War II, and even President Richard Nixon did in the early 1970s.
Reversing the decades of rigid neoliberal policies would not only prioritize the vast majority of American people over corporate wealth, it would also be good politics for a party that has seemed helpless in fighting back against the gains of the right, and encourage the growth of the progressive movement that is central to achieving the transformative social change so essential for our future.
US For-Profit Health System Is a Mass Killer
by Richard Eskow | the smirking chimp
June 19, 2022
Imagine waking up to a headline that reads, "Atlanta Demolished by Nuclear Bomb," and learning that the city's 498,715 residents were dead. The shock to our society would be unimaginable. And yet, we just learned that the American health system killed more people than that in the last two years alone and hardly anyone noticed. The fact that we've also wasted more than a trillion dollars barely merits an afterthought.
The figures are laid out in a new report from the National Academy of Sciences. The goal of the report is to calculate how many lives we could have saved and how much less money we would have spent if a single-payer health system had been in place before the COVID-19 pandemic. Its conclusion? More than 338,000 lives would have been saved between January 2020 and March 2022, and the country would have saved more than $105 billion in hospital expenses.
But that's just part of the story. Even without a pandemic, the authors conclude that we would have experienced 77,675 needless deaths and added costs of $438 billion every year because we've refused to adopt a single-payer system. If we multiply those numbers by 2.25 (for January 2020-March 2022) and add them to the Covid losses, that tells us how much our privatized system has cost us since the pandemic began: 513,363 needless deaths, and more than $1 trillion in wasted money. (Plus even more since the end of March.)
For a country that claims to hate fiscal irresponsibility, that's sure a lot of wasted money. And for a country that claims to cherish life, that's sure a lot of needless death.
No, wait. "Needless death" is far too genteel a term for what we're doing. I've used the term "negligence" to describe deaths like these in the past, but that's too mild, too. "Human sacrifice" is better.
513,363. That's more than the population of Atlanta. Or Minneapolis, Minnesota. Or Miami, Florida. Or Kansas City, Missouri. Or Omaha, Nebraska. Bakersfield, Tampa, Tulsa. New Orleans, Cleveland, Honolulu, Cincinnati ...
I could go on, but you get the idea.
The million-plus lives we've lost in the pandemic should have convinced us that the life-and-death question of health care is ... well, a life-and-death question. It should also have disabused us of the notion that it is 'moderate' to reject single-payer care and stick with the current, lethal system instead.
That's not 'moderate.' It's murderous.
To the politicians who support the current system: don't worry. I'm sure we can figure out how to retain its most distinguishing features once we've moved to single-payer healthcare. For example, we could nuke a different American city once a year, and send half a trillion dollars to United Health, Aetna, Anthem and Cigna at the same time. That would preserve the primary outcomes of the system you're so eager to embrace.
It does leave a thorny question, however. How big should our target cities be? The size of the cities listed above reflects losses during the pandemic. Won't the kill rate go down when the pandemic passes?
The answer to that question depends on whether the pandemic ever passes and whether there will be new disease catastrophes to follow. The way we're handling this one, it's possible we could be in pandemic territory forever. But, fair is fair. Let's go with the more conservative number and target smaller cities.
In non-pandemic years, the US healthcare kill rate is roughly 75,000-80,000, so we could plan on targeting cities of roughly that size until the next variant arises. The president's hometown of Scranton, PA qualifies. So does my hometown of Utica, NY. We should all share the sacrifice, so that seems only fair.
What other cities are eligible? Wilmington, Delaware? Check. Duluth, Minnesota? You're up. Flint, Michigan? Oh, wait, we've already sacrificed you. Youngstown, Ohio... Camden, New Jersey... Gary, Indiana... We've already abandoned a lot of these cities economically, so the big corporations will hardly miss them. To the people who run this economy, the people there are already excess human inventory.
Or, here's another thought: We could stop murdering our own population en masse. We could adopt single-payer healthcare and devote ourselves to saving lives and resources, rather than churning profits for Wall Street investors and wealthy executives.
Some people will call that idea radical, but it sounds pretty moderate to me.
Republican Counties Have Higher Mortality Rates Than Democratic Ones, Study Finds
Madeline Halpert, Forbes
6/7/2022
Americans living in counties that voted Republican during presidential elections from 2000 to 2016 experienced higher death rates than those who lived in Democratic counties, according to a new British Medical Journal study, which researchers said added to a growing body of evidence that liberal policies may lead to better health outcomes.
Key Facts
Mortality rates dropped 22% in Democratic counties compared to 11% in Republican counties from 2001 to 2019, according to the study, which used data from the Centers for Disease Control and Prevention linked to county-level presidential election data.
Between 2001 and 2019, the gap in death rates between Democratic and Republican counties also grew by more than 500%, driven largely by deaths from heart disease; cancer; chronic lung disease; unintentional injuries, including drug overdoses; and suicide.
Black and Hispanic residents experienced similar improvements in mortality rates in Democratic and Republican counties, while white residents in Democratic counties saw 15% lower mortality rates than their white counterparts in Republican counties in 2019, compared to a 3% gap in death rates in 2001.
Black Americans saw higher mortality rates in both Democratic and Republican counties compared to white and Hispanic Americans from 2001 to 2019, although Black residents in both Democratic and Republican counties experienced substantial improvements in mortality rates since 2001.
Rural Republican counties experienced the highest mortality rates out of all groups and saw the smallest improvement in death rates over time, suggesting political environments play a crucial role in the widening mortality gap between urban and rural areas, according to the researchers.
What We Don’t Know
Exactly how and what local policies may have affected health outcomes. Researchers studied mortality rates based on whether counties voted for Democratic or Republican presidential candidates, but they did not study specific factors linking political environment and mortality.
Key Background
Previous research has shown counties that elect Republicans tend to see worse health outcomes, including fewer improvements in life expectancy and higher rates of deaths from suicide, drugs and alcohol. The new research comes a day after a study published in Health Affairs showed people living in counties that voted Republican in the 2020 presidential election were more likely to die from Covid-19 than those in counties that voted for Democrats. Researchers have found more liberal policies such as labor, immigration and environmental protections are linked with better life expectancy, while more conservative policies like abortion restrictions and lax gun control laws are associated with lower life expectancy. More liberal states are also more likely to implement welfare policies that act as a safety net for vulnerable populations, such as Medicaid expansion, which has led to better health care and reductions in mortality, while Republican-led states are more likely to have higher rates of uninsured people, according to the researchers. The BMJ study is the first to include data from after the 2016 presidential election as well as a breakdown of mortality rates by sex, race, ethnicity and location. Researchers suggested further studies should focus on the factors causing a growing difference in mortality rates between Republican and Democratic counties.
BOTTLED WATER GIANT BLUETRITON ADMITS CLAIMS OF RECYCLING AND SUSTAINABILITY ARE “PUFFERY”
BlueTriton, owner of Poland Spring and other brands of water packaged in plastic, stated in a court filing that its claims of sustainability are “vague and hyperbolic.”
Sharon Lerner - the intercept
April 26 2022
IN ONGOING LITIGATION over the greenwashing of plastic recycling, the bottled water company BlueTriton made a revealing argument: its claims of being environmentally friendly aren’t violations of the law, because they are “aspirational.”(i.e. BULLSHIT)
BlueTriton — which owns Poland Spring, Pure Life, Splash, Ozarka, and Arrowhead, among many other brands — is estimated to contribute hundreds of millions of pounds of plastic to U.S. landfills each year. BlueTriton used to be known as Nestlé Waters North America, which was bought by the private equity firm One Rock Capital Partners in March 2021. The company, which has a history of draining aquifers to get the water that it encases in polluting plastic, owns about a third of bottled water brands in the U.S. Yet with sleek, green — and blue — PR materials, BlueTriton markets itself as a solution to the problems of plastic waste and water.
“Water is at the very core of our sustainable efforts to meet the needs of future generations,” BlueTriton declares on its website, spelling out its promise for sustainable stewardship over a picture of pine trees, pristine water, and clouds. The company’s Instagram account is similarly nature-oriented and wholesome, filled with green-tinged images of people hiking and enhancing the native trout population.
The claims were a bridge too far for the environmental group Earth Island Institute, which sued BlueTriton in August, arguing that its misleading sustainability claims violate a local Washington, D.C., law known as the Consumer Protection Procedures Act, which is designed to prevent “deceptive trade practices.” In response, the company defended its green self-promotion by explaining that everyone should realize that the claims are meaningless nonsense.
“Many of the statements at issue here constitute non-actionable puffery,” BlueTriton’s attorneys wrote in a motion to dismiss the case submitted to a D.C. court in March. “BlueTriton’s representation of itself as ‘a guardian of sustainable resources’ and ‘a company who, at its core, cares about water’ is vague and hyperbolic,” the attorneys continued. “Because these statements are ‘couched in aspirational terms,’ they cannot serve as the basis for Plaintiff’s CPPA claim.”
Dirty Business
When BlueTriton picked a new logo in April 2021, it explained its choice on Instagram as a nod to its commitment to nature and environmentalism. “Triton is a god of the sea in classical Greek mythology,” the company wrote. “Combined with the color blue, representing water, the new name and logo reflect our role as a guardian of sustainable resources and a provider of fresh water.”
Several of its brands go even further, suggesting that they are helping address the plastic problem because the bottles can in principle be recycled. BlueTriton brands Poland Spring, Ozarka, and Zephyrhills Water advertise that “We use #1PET plastic, which can be used over and over again!” Pure Life water boasts that all its bottles are “100% recyclable … and can be used for new bottles and all sorts of new, reusable things.” Deer Park claims that its recyclable bottles help “keep plastic out of landfills” and that the company “care[s] about you & our planet.”
In truth, there is overwhelming evidence that recycling cannot solve the plastic problem. Since the 1950s, only 9 percent of plastic produced has been recycled, while the vast majority of plastic waste is either landfilled or incinerated. Six times more plastic waste is burned than recycled in the United States. Packaging, including the PET bottles that BlueTriton brands describe as recyclable, accounts for more than half the plastic that winds up in landfills.
As the complaint notes, plastic pollution is now so widespread that the average person is drinking more than 1,700 tiny bits of plastic in a week’s worth of drinking water — the equivalent of an entire credit card. Microplastics are found in 94.4 percent of tap water samples in the U.S. and may be an even bigger problem in bottled water, despite bottled water companies marketing their product as pollution-free. One BlueTriton brand, Pure Life, had twice the level of plastic fibers as tap water.
Meanwhile, as BlueTriton touts itself as a solution to America’s water problems, it has been caught extracting water from the national forest without authorization. The practice of tapping into natural water supplies has been shown to drain aquifers and rivers, taking water from plants and animals as well as public drinking water reserves.
Empty Promises
With rising public awareness of the role played by bottled water companies in the plastic pollution crisis, companies have publicly pledged to do better. In 2008, Nestlé Waters North America committed to recycling 60 percent of PET bottles by 2018. The company proudly announced its intentions in its first corporate citizenship report (which is no longer available online). But when the deadline came and its recycling rate was still less than half of its goal — just 28.9 percent, according to a 2020 report by the Changing Markets Foundation — the company just issued another pledge rather than dwelling on its failure to meet the earlier one.
The loud announcement of lofty goals for plastic recycling followed by the quiet failure to meet them is part of a larger pattern. Since at least 1990, Coca-Cola has made repeated promises on the plastics front, including commitments to use more recycled plastic, recover and refill more of its bottles, and incorporate more plant-based materials. The company, which has fought against efforts that would reduce plastic waste and recently hired Bill Nye to help clean up its image, regularly rolls out these goals with much fanfare and rarely, if ever, meets them. Coca-Cola did not respond to an inquiry for this story.
The distances between PR and reality are particularly pronounced around pledges to increasingly rely on recycled plastic, which is far more expensive to use than new plastic. According to Beyond Plastics, 10 major corporations — including L’Oréal, Unilever, Nestlé, and PepsiCo — had promised vast reductions in their dependence on virgin plastic while continuing to rely on new plastic. The environmental advocacy organization based its findings on 2019 data, the most recent available.
BlueTriton, which does not publicly list a media contact and provides no way for reporters to ask questions, did not respond to an inquiry from The Intercept for this article (which was conveyed through a message left with the sales department). But in its filing that asks the court to dismiss the greenwashing suit, the company argues that some of its brands have taken several steps that show they are genuinely sustainable. It says that Pure Life, for instance, has converted the cooling towers in its bottling plants to reuse water that was previously discharged. The company is also "reduc[ing] the amount of plastic in our 0.5 liter bottles by over 40%" and "improving our production processes to reduce the amount of water needed to make one liter of Pure Life® purified water." One Rock Capital Partners, the private equity firm that bought Nestlé Waters North America, also did not respond to an inquiry from The Intercept.
Sumona Majumdar, general counsel at the Earth Island Institute, dismissed those claims. “You can’t claim to be a sustainable company while using plastic as your primary packaging,” said Majumdar. “Maybe there was a time when, as a company, you might have thought our plastic is getting recycled and getting turned back into plastic. But at this point, everybody knows that’s not true.”
Majumdar counts the company’s executives among those who clearly understand that they are contributing to the plastic waste crisis — even as their spin suggests otherwise.
“When you look at their Instagram feeds and their statements about sustainability, it seems like a fait accompli. But in this brief they filed, they’re admitting that they use these sustainability commitments just as marketing tools,” said Majumdar. “It’s just to get consumers to buy their goods, and not because they actually intend to follow through with their promises.”
“If You’re Getting a W-2, You’re a Sucker”
There are many differences between the rich and the rest of us, but one of the most consequential for your taxes is whether most of your income comes from wages.
by Paul Kiel - propublica
April 15, 5 a.m. EDT
...The financial reality of the ultrawealthy is not so easily defined. For one, wages make up only a small part of their earnings. And they have broad latitude in how they account for their businesses and investments. Their incomes aren’t defined by a tax form. Instead, they represent the triumph of careful planning by skilled professionals who strive to deliver the most-advantageous-yet-still-plausible answers to their clients. For them, a tax return is an opening bid to the IRS. It’s a kind of theory.
In that tax world, nearly anything is possible. Stephen Ross is one of the world’s most successful real estate developers, a billionaire many times over, the owner of the Miami Dolphins. Ross, a former tax lawyer, once praised tax law as a particularly “creative” endeavor, and he is a master of the craft. His tax returns showed a total of $1.5 billion in earnings from 2008 to 2017, but he didn’t pay a dime in federal income taxes during that time. How? By mining a mountain of losses he claimed for tax purposes, as ProPublica reported. Look at Ross’s “income” for any of those years, and you’ll see numbers as low as negative $447 million. (He told ProPublica he abides by the tax laws.)
Texas billionaire Kelcy Warren owns a massively profitable natural gas pipeline company. But in an orgy of cake eating and having, he’s able to receive hundreds of millions of dollars from his company tax-free while reporting vast losses to the IRS thanks to energy-industry and other tax breaks, his records showed. (Warren did not respond to our questions.)
Based on those reported “incomes,” both Ross and Warren received COVID stimulus checks in 2020. We counted at least 16 other billionaires (along with hundreds of other ultrawealthy people, including hedge fund managers and former CEOs) among the stimulus check recipients. This is just how our system works. It’s why, in 2011, Jeff Bezos, then worth $18 billion, qualified for $4,000 in refundable child tax credits. (Bezos didn’t respond to our questions.)
A recent study by the Brookings Institution set out with a simple aim: to compare what owners of privately held businesses say they earn with the income that appears on the owners’ tax returns. The findings were stark: “More than half of economic income generated by closely held businesses does not appear on tax returns and that ratio has declined significantly over the past 25 years.”
That doesn’t mean business owners are illegally hiding income from the IRS, though it’s certainly a possible contributor. There are plenty of ways to make income vanish legally. Tax perks like depreciation allow owners to create tax losses even as they expand their businesses, and real estate developers like Ross can claim losses even on appreciating properties. “Losses” from one business can also be used to wipe out income from another. Sometimes spilling red ink can be lots of fun: For billionaires, owning sports teams and thoroughbred racehorses are exciting loss-makers.
Congress larded the tax code with these sorts of provisions on the logic that what’s good for businesses is good for the economy. Often, the evidence for this broader effect is thin or nonexistent, but you can be sure all this is great for business owners. The Brookings study found that households worth $10 million or more benefited the most from being able to make income disappear.
This isn’t just about a divide between rich and poor. Take two people, each earning $1 million, one through salary, the other through their business. Though they may live in the same neighborhood and send their kids to the same private school, they do not share the same tax world.
Under the current system, said John Sabelhaus, a former Federal Reserve economist and one of the study’s authors, “if you’re getting a W-2, you’re a sucker.”
This basic divide is also apparent in how tax laws are enforced. To the IRS, the average worker is an open book, since all their income is disclosed on those W-2s and 1099s. Should they enter an errant number on their tax return, a computer at the agency can easily catch it.
But that’s generally not true for private businesses. Such companies are often tangles of interrelated partnerships that, like densely grown forest, can be hard to penetrate. Auditing businesses like these “certainly is a test of endurance,” said Spretnak, the former IRS agent.
If she managed to solve the puzzle of how income flowed from one entity to another, she moved on to a stiffer challenge. It didn’t matter if what she saw made her jaw drop. She had to prove that the business’s tax geniuses had exceeded even what the generous tax laws allowed them to do. Often, she found, they had. Making her findings stick against a determined and well-funded opponent was her final hurdle.
By the time Spretnak retired in 2018, the IRS had gone from merely budget-constrained to budget-starved. Thousands of skilled auditors like her have left, not to be replaced. Audits of the wealthy have plummeted. Business owners have still more reason to be bold.
On the other side of the chasm from the W-2er, there’s still another tax world, one that’s even more foreign than that of business income. It’s the paradise of unrealized gains, a place particularly enjoyed by the major shareholders of public companies.
If your company’s stock shoots up and you grow $1 billion richer, that increase in wealth is real. Banks will gladly lend to you with such ample collateral, and magazines will put you on their covers. But if you simply avoid selling your appreciated assets (that is, realizing your gains), you haven’t generated income and therefore owe no tax.
Economists have long argued that to exclude such unrealized gains from the definition of income is to draw an arbitrary line. The Supreme Court, as far back as 1940, agreed, calling the general rule of not taxing unrealized gains an “administrative convenience.”
From 2014 to 2018, the 25 wealthiest Americans grew about $400 billion richer, according to Forbes. To an economist, this was income, but under tax law, it was mere vapor, irrelevant. And so this group, including the likes of Bezos, Elon Musk and Warren Buffett, paid federal income taxes of about 3.4% on the $400 billion, ProPublica reported. We called this the group’s “True Tax Rate.”
Recently, the Biden administration took a major step toward the “True Tax Rate” way of seeing things. It proposed a Billionaire Minimum Income Tax for the ultrawealthy that would treat unrealized gains as income and tax them at 20%.
To say that the idea’s fate in the Senate is uncertain would probably be overstating its chances. It is nevertheless a landmark proposal. Instead of the usual talk of raising income tax rates on the rich, the Biden proposal advocates a fundamental rethinking.
In the tax system we have, billionaires who’d really rather not pay income taxes can usually find a way not to. They can bank their accumulating gains tax-free and deploy tax losses to wipe out whatever taxable income they might have. They can even look forward to a few thousand dollars here and there from the government to help them raise their kids or get through a national emergency.
You can think of efforts to change this system as a battle between the rich and everybody else. And sure, it is. But it’s also an effort to pull those other tax worlds down to the terra firma of the wage earner, to make it so a W-2 isn’t the mark of a sucker.
Poorest US Counties Suffered Twice the COVID Deaths of the Richest
BY Jake Johnson, Common Dreams
PUBLISHED April 4, 2022
A first-of-its-kind examination of the coronavirus pandemic’s impact on low-income communities published Monday shows that Covid-19 has been twice as deadly in poor counties as in wealthy ones, a finding seen as a damning indictment of the U.S. government’s pandemic response.
“The neglect of poor and low-wealth people in this country during a pandemic is immoral, shocking, and unjust, especially in light of the trillions of dollars that profit-driven entities received,” said Rev. Dr. William Barber II, co-chair of the national Poor People’s Campaign, which conducted the new analysis alongside a team of economists and other experts.
Released on the 54th anniversary of Dr. Martin Luther King Jr.’s murder in Memphis, Tennessee — where he was fighting for the rights and dignity of low-wage sanitation workers — the new report aims to bring to the forefront the relationship between poverty, income, and occupation and Covid-19 mortality.
The extent to which class is a predictor of coronavirus vulnerability is understudied, according to Barber, who noted that “Covid-19 data collection does not include data on poverty, income, or occupation, alongside race and pandemic outcomes.”
“The Poor People’s Pandemic Digital Report and Intersectional Analysis addresses this knowledge gap,” said Barber, “and exposes the unnecessary deaths by mapping community characteristics and connecting them with Covid-19 outcomes.”
Assessing figures from more than 3,000 U.S. counties, the researchers estimated that the poorest counties have suffered twice as many coronavirus-related deaths as the wealthiest. In the most fatal waves of the coronavirus pandemic — the spike in the winter of 2020-2021 and the Omicron surge — the poorest counties suffered 4.5 times more deaths than the wealthiest.
“This cannot be explained by vaccination status,” Shailly Gupta Barnes, policy director for the Poor People’s Campaign, said in a statement. “Over half of the population in [the poorest] counties have received their second vaccine shot, but uninsured rates are twice as high.”
The analysis features an interactive map that ranks counties based on the intersection between poverty rates — specifically, the percentage of people living below 200% of the official poverty line — and coronavirus death rates.
The county highest on the list is Galax, Virginia, where nearly 50% of the population lives below 200% of the poverty line. The county has a coronavirus death rate of 1,134 per 100,000 people, far higher than the national rate of 299 per 100,000.
Next on the list is Hancock, Georgia, which has a Covid-19 death rate of 1,029 per 100,000 people. More than 52% of the county’s population lives below 200% of the poverty line.
The counties with the highest coronavirus death rates, according to the new report, had one-and-a-half times higher poverty rates than counties with lower death rates.
Dr. Jeffrey Sachs, president of the U.N. Sustainable Development Solutions Network and one of the experts behind the study, said the findings make clear that the pandemic is “not only a national tragedy, but also a failure of social justice.”
“The burden of disease — in terms of deaths, illness, and economic costs — was borne disproportionately by the poor, women, and people of color,” said Sachs. “The poor were America’s essential workers, on the front lines, saving lives and also incurring disease and death.”
The researchers who conducted the analysis are expected to amplify their findings and discuss their implications during a press conference in Washington, D.C. at 10:00 am ET. The press conference will also feature people from some of the poorest, hardest-hit counties examined in the report.
The analysis was released as the U.S. moves closer to the grim milestone of 1 million coronavirus deaths, an estimated toll that’s widely seen as an undercount.
Rev. Dr. Liz Theoharis, national co-chair of the Poor People’s Campaign, said in a statement Monday that “the Covid-19 disparities among counties across the U.S. are striking.”
“This report shows clearly that Covid-19 became a ‘poor people’s pandemic,’” said Theoharis. “We can no longer ignore the reality of poverty and dismiss its root causes as the problems of individual people or communities. There has been a systemic failure to address poverty in this country and poor communities have borne the consequences not only in this pandemic, but for years and generations before.”
“However, this does not need to continue,” she added. “Our nation has the resources to fully address poverty and low wealth from the bottom up.”
Life expectancy lowest in red states -- and the problem is getting worse
Travis Gettys - raw story
March 17, 2022
Life expectancy is lower in Republican-led states, and the problem has been growing worse for decades.
Health disparities became worse in the 1990s, according to a study in the Journal of the American Medical Association, and researchers blame conservative governors who override public safety measures such as indoor smoking bans, nutrition regulations, firearm restrictions and COVID-19 mitigation, reported the Washington Post.
"It should come as no surprise that the highest rates for COVID-19 deaths and murders are found mainly in red states," wrote columnist Jennifer Rubin. "A political mind-set that prioritizes racial resentment, anti-science zealotry and manufactured cultural wedge issues is not likely to be conducive to long, healthy lives. Indeed, antagonism toward 'elites' (e.g., experts) often impedes common-sense measures that save lives."
Eight of the 10 states with the highest age-adjusted COVID-19 death rates have GOP governors, as do nine of the 10 states with the worst vaccination rates. A recent report found murder rates were 40 percent higher per capita in states won by Donald Trump, and eight of the 10 states with the highest murder rates in 2020 have backed the Republican presidential nominee in every election this century.
"Whatever the specific reason, it’s clear the governing philosophy of right-wing states (e.g., low spending; prioritization of cultural wedge issues; anti-elitism) leads to deadly results," Rubin wrote. "Maybe it’s time they stop spending their political energy persecuting gay kids, banning books, outlawing abortion and fanning culture wars. They have plenty of systemic problems they’ve failed to address while busying themselves with MAGA crusades."
"Red-state voters should look around and see why their states have fallen so far behind in so many categories," she added.
excerpt: The Red State Murder Problem
third way
Published March 15, 2022
Every news outlet from FOX to CNN to The New York Times to local newspapers has a story with attention-grabbing headlines like “US cities hit all-time murder records.” Fox News and Republicans have jumped on this and framed it as a “Democrat” problem. They blame it on Democrats’ “soft-on-crime” approach and have even referred to a New York District Attorney’s approach as “hug-a-thug.” Many news stories outside of Fox have also suggested that police reform is responsible for this rise in murder and have pointed to cities like New York and Los Angeles.
There is a measure of truth to these stories. The US saw an alarming 30% increase in murder in 2020. While 2021 data is not yet complete, murder was on the rise again this past year. Some “blue” cities, like Chicago, Baltimore, and Philadelphia, have seen real and persistent increases in homicides. These cities—along with others like Los Angeles, New York, and Minneapolis—are also in places with wall-to-wall media coverage and national media interest.
But there is a large piece of the homicide story that is missing and calls into question the veracity of the right-wing obsession over homicides in Democratic cities: murder rates are far higher in Trump-voting red states than Biden-voting blue states. And sometimes, murder rates are highest in cities with Republican mayors.
For example, Jacksonville, a city with a Republican mayor, had 128 more murders in 2020 than San Francisco, a city with a Democratic mayor, despite their comparable populations. In fact, the homicide rate in Speaker Nancy Pelosi’s San Francisco was half that of House Republican Leader Kevin McCarthy’s Bakersfield, a city with a Republican mayor that overwhelmingly voted for Trump. Yet there is barely a whisper, let alone an outcry, over the stunning levels of murders in these and other places.
---
We found that murder rates are, on average, 40% higher in the 25 states Donald Trump won in the last presidential election compared to those that voted for Joe Biden. In addition, murder rates in many of these red states dwarf those in blue states like New York, California, and Massachusetts. And finally, many of the states with the worst murder rates—like Mississippi, Kentucky, Alabama, South Carolina, and Arkansas—are ones that few would describe as urban. Only 2 of America’s top 100 cities in population are located in these high murder rate states. And not a single one of the top 10 murder states registers in the top 15 for population density.
---
Trump-Voting States Account for 8 out of the 10 Highest Murder Rates in 2020
If you’re tuned in to the media, you’d think murder is rocketing skyward in New York, California, and Illinois. But those states don’t even crack the top ten.
In fact, the top per capita murder rate states in 2020 were mostly those far from massive urban centers and Democratic mayors and governors. Eight of the top ten worst murder rate states voted for Trump in 2020. None of those eight has supported a Democrat for president since 1996.
The chart below shows the top 10 murder rate states in 2020. Mississippi had the highest homicide rate at 20.50 murders per 100,000 residents, followed by Louisiana at 15.79, Kentucky at 14.32, Alabama at 14.2, and Missouri at 14. The national average was 6.5 per 100,000 residents, but the top five states had rates more than twice that high.
These red states are not generating “murder is out of control” national headlines. They seem to generate no headlines at all. The rest of the top ten were filled out by South Carolina, New Mexico, Georgia, Arkansas, and Tennessee—all states rarely talked about in breathless media reports about rampant crime in Democratic strongholds. Notably, New Mexico and Georgia were the only Biden-voting states in the top ten, and they ranked seventh and eighth, respectively.
Five of the largest Biden-voting states by population, and those often in the news when it comes to crime, had much lower murder rates. New York at 4.11 per 100,000 residents, California at 5.59, and New Jersey at 3.70 were each well below the national average. Pennsylvania (7.22) and Illinois (9.20) were higher than the national average. But Mississippi’s murder rate was nearly 400% higher than New York’s, more than 250% higher than California’s, and about 120% higher than Illinois’s. In fact, the five states with the highest murder rates, all Trump-voting states, had rates at least 240% higher than New York’s murder rate and at least 150% higher than California’s, the homes to some of the largest cities featured prominently in the “crime is out of control” narrative.
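The percentage comparisons above follow directly from the per-100,000 rates the article cites; a quick sanity check (the rates are from the article, the rounding is mine):

```python
# Verify the relative murder-rate comparisons from the 2020 per-capita figures.
# Rates are murders per 100,000 residents, as quoted in the article.
rates = {
    "Mississippi": 20.50,
    "New York": 4.11,
    "California": 5.59,
    "Illinois": 9.20,
}

def pct_higher(a, b):
    """How much higher rate a is than rate b, expressed in percent."""
    return (a / b - 1) * 100

for state in ("New York", "California", "Illinois"):
    diff = pct_higher(rates["Mississippi"], rates[state])
    print(f"Mississippi vs {state}: {diff:.0f}% higher")
# Mississippi vs New York: 399% higher   ("nearly 400%")
# Mississippi vs California: 267% higher ("more than 250%")
# Mississippi vs Illinois: 123% higher   ("about 120%")
```

The computed figures match the article’s characterizations within rounding.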
---
2020 Murder Rates Are 40% Higher in Trump-Voting States Compared to Biden-Voting States.
Beyond the top 10, we looked at the 2020 murder rates in the 25 states that voted for Donald Trump and compared it with the murder rates in the 25 states that voted for Joe Biden. The 8.20 murders per 100,000 residents rate in Trump states was 40% higher than the 5.78 murders per 100,000 residents in Biden states. These Biden-voting states include the “crime-is-out-of-control” cities of Los Angeles, New York City, Chicago, Detroit, Philadelphia, Portland, Baltimore, and Minneapolis, among other large cities.
Among the 50 states, murder rates were often well above the national average in many Republican-controlled states and cities. Jacksonville, with 176 homicides and a murder rate (19.77) more than three times that of New York City (5.94), has a Republican mayor. Tulsa (19.64) and Oklahoma City (11.16) have Republican mayors in a Republican state and have murder rates that dwarf that of Los Angeles (6.74). Lexington’s Republican mayor saw record homicides in 2020 and 2021, with a murder rate (10.61) nearly twice that of New York City. Bakersfield (11.91) and Fresno (14.09) each have Republican mayors and murder rates far higher than either San Francisco or Los Angeles.
Of course, some cities controlled by Democrats have alarming murder rates, like Chicago (28.49) and Houston (17.32). But we hear about these and other Democrat-run cities all the time. We aren’t getting the whole picture.
[...]
There is a measure of truth to these stories. The US saw an alarming 30% increase in murder in 2020. While 2021 data is not yet complete, murder was on the rise again this past year. Some “blue” cities, like Chicago, Baltimore, and Philadelphia, have seen real and persistent increases in homicides. These cities—along with others like Los Angeles, New York, and Minneapolis—are also in places with wall-to-wall media coverage and national media interest.
But there is a large piece of the homicide story that is missing and calls into question the veracity of the right-wing obsession over homicides in Democratic cities: murder rates are far higher in Trump-voting red states than Biden-voting blue states. And sometimes, murder rates are highest in cities with Republican mayors.
For example, Jacksonville, a city with a Republican mayor, had 128 more murders in 2020 than San Francisco, a city with a Democrat mayor, despite their comparable populations. In fact, the homicide rate in Speaker Nancy Pelosi’s San Francisco was half that of House Republican Leader Kevin McCarthy’s Bakersfield, a city with a Republican mayor that overwhelmingly voted for Trump. Yet there is barely a whisper, let alone an outcry, over the stunning levels of murders in these and other places.
---
We found that murder rates are, on average, 40% higher in the 25 states Donald Trump won in the last presidential election compared to those that voted for Joe Biden. In addition, murder rates in many of these red states dwarf those in blue states like New York, California, and Massachusetts. And finally, many of the states with the worst murder rates—like Mississippi, Kentucky, Alabama, South Carolina, and Arkansas—are ones that few would describe as urban. Only 2 of America’s top 100 cities in population are located in these high murder rate states. And not a single one of the top 10 murder states registers in the top 15 for population density.
---
Trump-Voting States Account for 8 out of the 10 Highest Murder Rates in 2020
If you’re tuned in to the media, you’d think murder is rocketing skyward in New York, California, and Illinois. But those states don’t even crack the top ten.
In fact, the top per capita murder rate states in 2020 were mostly those far from massive urban centers and Democratic mayors and governors. Eight of the top ten worst murder rate states voted for Trump in 2020. None of those eight has supported a Democrat for president since 1996.
Among the top 10 murder rate states in 2020, Mississippi had the highest homicide rate at 20.50 murders per 100,000 residents, followed by Louisiana at 15.79, Kentucky at 14.32, Alabama at 14.2, and Missouri at 14. The national average was 6.5 per 100,000 residents, but the top five states had rates more than twice that high.
These red states are not generating “murder is out of control” national headlines. They seem to generate no headlines at all. The rest of the top ten were filled out by South Carolina, New Mexico, Georgia, Arkansas, and Tennessee—all states rarely talked about in breathless media reports about rampant crime in Democratic strongholds. Notably, New Mexico and Georgia were the only Biden-voting states in the top ten, and they ranked seventh and eighth, respectively.
Five of the largest Biden-voting states by population, and those often in the news when it comes to crime, had much lower murder rates. New York at 4.11 per 100,000 residents, California at 5.59, and New Jersey at 3.70 were each well below the national average. Pennsylvania (7.22) and Illinois (9.20) were higher than the national average. But Mississippi’s murder rate was nearly 400% higher than New York’s, more than 250% higher than California’s, and about 120% higher than Illinois’s. In fact, the five states with the highest murder rates, all Trump-voting states, had rates at least 240% higher than New York’s murder rate and at least 150% higher than California’s, the homes to some of the largest cities featured prominently in the “crime is out of control” narrative.
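For readers who want to check the arithmetic behind these comparisons, here is a minimal Python sketch using only the per-100,000 rates quoted in this article (the `pct_higher` helper is ours, not the article's):

```python
# Sketch: verify the article's "% higher" claims from the quoted
# 2020 murder rates (murders per 100,000 residents).
rates = {
    "Mississippi": 20.50,
    "New York": 4.11,
    "California": 5.59,
    "Illinois": 9.20,
    "Trump states": 8.20,   # average across the 25 Trump-voting states
    "Biden states": 5.78,   # average across the 25 Biden-voting states
}

def pct_higher(a, b):
    # How much higher rate a is than rate b, as a percentage.
    return (rates[a] / rates[b] - 1) * 100

print(round(pct_higher("Mississippi", "New York")))      # 399 -> "nearly 400% higher"
print(round(pct_higher("Mississippi", "California")))    # 267 -> "more than 250%"
print(round(pct_higher("Mississippi", "Illinois")))      # 123 -> "about 120%"
print(round(pct_higher("Trump states", "Biden states"))) # 42, which the piece rounds to 40%
```

The numbers reproduce each of the article's stated comparisons from the raw rates.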
---
2020 Murder Rates Are 40% Higher in Trump-Voting States Compared to Biden-Voting States
Beyond the top 10, we looked at the 2020 murder rates in the 25 states that voted for Donald Trump and compared it with the murder rates in the 25 states that voted for Joe Biden. The 8.20 murders per 100,000 residents rate in Trump states was 40% higher than the 5.78 murders per 100,000 residents in Biden states. These Biden-voting states include the “crime-is-out-of-control” cities of Los Angeles, New York City, Chicago, Detroit, Philadelphia, Portland, Baltimore, and Minneapolis, among other large cities.
Among the 50 states, murder rates were often well above the national average in many Republican-controlled states and cities. Jacksonville, with 176 homicides and a murder rate (19.776) more than three times that of New York City (5.94), has a Republican mayor. Tulsa (19.64) and Oklahoma City (11.16) have Republican mayors in a Republican state and have murder rates that dwarf that of Los Angeles (6.74). Lexington’s Republican mayor saw record homicides in 2020 and 2021, with a murder rate (10.61) nearly twice that of New York City. Bakersfield (11.91) and Fresno (14.09) each have Republican mayors and murder rates far higher than either San Francisco or Los Angeles.
Of course, some cities controlled by Democrats have alarming murder rates, like Chicago (28.49) and Houston (17.32). But we hear about these and other Democrat-run cities all the time. We aren’t getting the whole picture.
[...]
'Conflicted Congress': Key findings from Insider's five-month investigation into federal lawmakers' personal finances
[email protected] (Dave Levinthal) - business insider
12/13/2021
- Dozens of federal lawmakers and at least 182 top staffers have violated a conflict-of-interest law.
- Numerous members of Congress personally invest in industries they oversee.
- Few face serious consequences, legally or otherwise.
The nation is unabashedly polarized. Republicans and Democrats enjoy little goodwill and less commonality.
But in Washington, DC, a bipartisan phenomenon is thriving. Numerous members of Congress, both liberal and conservative, are united in their demonstrated indifference toward a law designed to quash corruption and curb conflicts of interest.
Insider's new investigative reporting project, "Conflicted Congress," chronicles the myriad ways members of the US House and Senate have eviscerated their own ethical standards, avoided consequences, and blinded Americans to the many moments when lawmakers' personal finances clash with their public duties.
In all, Insider spent hundreds of hours over five months reviewing nearly 9,000 financial-disclosure reports for every sitting lawmaker and their top-ranking staffers. Reporters conducted hundreds of interviews, including those with some of the nation's most powerful leaders.
Today, Insider published the first of more than two-dozen articles and data visualizations that will reveal the:
- 48 members of Congress and 182 senior-level congressional staffers who have violated a federal conflicts-of-interest law.
- Nearly 75 federal lawmakers who held stocks in COVID-19 vaccine makers Moderna, Johnson & Johnson, or Pfizer in 2020, with many of them buying or selling these stocks in the early weeks of the pandemic.
- 15 lawmakers tasked with shaping US defense policy who actively invest in military contractors.
- More than a dozen environmentally-minded Democrats who invest in fossil fuel companies or other corporations with concerning environmental track records.
- Members who regularly chide "the media" but personally pour their money into at least one of the nation's largest news media or social media companies, including Facebook, Twitter, Comcast, Disney, and the New York Times Co.
Insider's "Conflicted Congress" is also rating every member of Congress on their financial conflicts and commitment to financial transparency. Fourteen senators and House members have received a red "danger" rating on our three-tier stoplight scale, while 112 get a yellow "borderline" rating.
Throughout this week, "Conflicted Congress" will publish investigations into Congress' tobacco ties, cryptocurrency plays, real estate investments, transparency avoidance, lax law enforcement, and crushing student loan debt.
Other articles will reveal the 25 wealthiest members of Congress and where they put their money, as well as the 50 most popular stock holdings among members of Congress.
Finally, Insider on Friday will publish an exclusive, searchable, and sortable database of all members of Congress' personal finances, including their assets, debts, and sources of outside income. (Data geeks get ready!)
Read the original article on Business Insider
We Could Vaccinate the World 3 Times Over If the Rich Paid the Taxes They Owe
BY Jake Johnson, Common Dreams
PUBLISHED November 16, 2021
Ending abuses of the global tax system by the super-rich and multinational corporations would allow countries to recoup nearly half a trillion dollars in revenue each year — enough to vaccinate the world’s population against Covid-19 three times over.
That estimate is courtesy of The State of Tax Justice 2021, a new report that argues rich countries — not the “palm-fringed islands” on the European Union’s tax haven blacklist — are the primary enablers of offshoring by large companies and tax evasion by wealthy individuals.
According to the report, members of the Organization for Economic Cooperation and Development (OECD) deserve “the lion’s share of blame” for permitting rampant abuses of the global tax system, which has become increasingly leaky in recent decades as countries have altered their laws to better serve the interests of the well-off.
Produced by the Tax Justice Network, the Global Alliance for Tax Justice, and Public Services International, the new report finds that $312 billion annually is “lost to cross-border corporate tax abuse by multinational corporations and $171 billion is lost to offshore tax evasion by wealthy individuals.”
“Higher-income countries are responsible for over 99% of all tax lost around the world in a year to offshore wealth tax evasion,” the report notes. “Lower-income countries are responsible for less than 1%.”
The total $483 billion in tax revenue lost to offshoring and evasion each year is only “the tip of the iceberg,” said Tax Justice Network data scientist Miroslav Palanský, who stressed that the estimate is just “what we can see above the surface thanks to some recent progress on tax transparency.”
“We know there’s a lot more tax abuse below the surface costing magnitudes more in tax losses,” he added.
Among OECD members, the United Kingdom and its so-called “spider web” of tax havens — along with the Netherlands, Luxembourg, and Switzerland — are the world’s worst enablers of global tax abuses, according to the new analysis, which comes weeks after the Pandora Papers further exposed how world leaders, celebrities, and billionaire business moguls are exploiting tax havens to shield trillions of dollars in assets.
While offshoring and evasion cost rich countries more money in absolute terms than poor nations, “their tax losses represent a smaller share of their revenues — 9.7% of their collective public health budgets.”
“Lower-income countries in comparison lose less tax in absolute terms, $39.7 billion a year, but their losses account for a much higher share of their current tax revenues and spending,” the new analysis finds. “Collectively, lower-income countries lose the equivalent of nearly half (48%) of their public health budgets — and unlike OECD members, they have little or no say on the international rules that continue to allow these abuses.”
The report estimates that the revenue poor countries lose to tax abuses on a yearly basis “would be enough to vaccinate 60% of their populations, bridging the gap in vaccination rates between lower-income and higher-income countries.”
Dr. Dereje Alemayehu, executive coordinator of the Global Alliance for Tax Justice, said in a statement that “the richest countries, much like their colonial forebearers, have appointed themselves as the only ones capable of governing on international tax, draped themselves in the robes of saviors, and set loose the wealthy and powerful to bleed the poorest countries dry.”
“To tackle global inequality,” said Alemayehu, “we must tackle the inequality in power over global tax rules.”
One way to do that, the new report argues, is to shift tax-setting authority away from the OECD — “a small club of rich countries” — to the United Nations.
Advocates say the case for such a move has become even more compelling since October, when OECD members agreed to a new global tax framework that would do little to meaningfully crack down on tax dodging by massive corporations.
Additionally, the new report recommends an excess profits tax on multinational corporations and a wealth tax designed “to fund the Covid-19 response and address the long-term inequalities the pandemic has exacerbated.”
“Another year of the pandemic, and another half-trillion dollars snatched by the wealthiest multinational corporations and individuals from public purses around the world,” Alex Cobham, chief executive at the Tax Justice Network, said in a statement. “Tax can be our most powerful tool for tackling inequality, but instead it’s been made entirely optional for the super-rich.”
“We must reprogram the global tax system to protect people’s wellbeing and livelihoods over the desires of the wealthiest,” Cobham added, “or the cruel inequalities exposed by the pandemic will be locked in for good.”
The seedy crimes of the obscenely rich are routinely ignored
John Stoehr -commentary-raw story
October 30, 2021
Imagine a world in which two things are true. One, you can make piles of cash as a direct result of breaking federal law. Think of it as theft by other means. Two, you won't ever get caught or be punished. Think of it as a veto on the rule of law. I'm not talking about illicit drug cartels. I'm talking about the respectable world of the very obscenely rich.
In fact, according to Businessweek, the heads of the country's biggest corporate firms almost never face investigation and prosecution by the federal government for using insider information on the stock market. That's despite their portfolios almost always beating the markets.
Trading on information that has not been made public is illegal. Yet it happens all the time. According to reporter Liam Vaughn, federal regulators seem to think it's not a big deal. Lax oversight encourages the already long-held view that "trading on sensitive information was widely considered a perk of being an executive at a publicly traded company, and that thinking seems to persist," even among the feds.
"A growing body of research suggests that many insiders are trading well thanks to more than luck or judgment," Vaughn wrote for the Oct. 4 issue. "It indicates that insider trading by executives is pervasive and that nobody — not the regulators, not the Department of Justice, not the companies themselves — is doing anything to stop it."
Vaughn reported a study by Daniel Taylor, a professor at the Wharton School. He and his co-authors found that "insiders who traded were able to avoid significant losses, particularly in instances when a company's results ended up having to be restated. Time and time again, 'insiders appear to exploit private information' for 'opportunistic gain.' … Cheating, they'd discovered, seemed to be everywhere."
The popular view, advanced most vocally by the Republican Party, is the very obscenely rich deserve being very obscenely rich. They earned their wealth. They built their companies. They invested wisely. They created jobs. They are the backbone of the American economy. They have created the impression that what's good for the very obscenely rich is good for America. And anything that's bad for the very obscenely rich is bad for America. Just ask Tilman Fertitta, owner of Landry's, a national dining, hospitality and gaming corporation.
Asked what he thought of raising taxes on the very obscenely rich, Fertitta told Fox: "All it's going to do is make me not build as much because I won't have the ability to create so many more jobs and then you are paying so many different taxes. Every employee pays: payroll taxes, all your sales taxes. All the taxes they pay. It's truly a mistake."
It's truly a lie.
The reality is sinister. When it comes to insider trading, the very obscenely rich are "profiting at the expense of regular people," thus breaking "America's basic bargain," wrote Preet Bharara, former US attorney for the southern district of New York, which covers Wall Street. In a 2018 op-ed for the Times, he said the very obscenely rich "should not have an unfair advantage over the everyday citizen."
Let's stop being so polite, though.
The very obscenely rich do not represent America. According to new data from the Federal Reserve, reported recently by CNBC, nearly 90 percent of everything being bought and sold on Wall Street is being bought and sold by the wealthiest 10 percent. That suggests, to me anyway, financial markets are separate and distinct from the regular economy, and anyone telling you that what's bad for the very obscenely rich is bad for America is someone trying to scam you.
Why wouldn't they? Not only do they have the blessing of the United States government, in the form of its doing nothing to stop rampant criminal behavior, but apparently, to be a member of the exclusive club of the very obscenely rich requires a capacity for cheating, lying and subterfuge that would scandalize most normal people, especially given the widespread understanding, advanced most vocally by the Republican Party, that the very obscenely rich are respectable, admirable, wise — nearly superhuman amid such fame and wealth.
Indeed, our culture venerates criminal minds, as long as they are white and rich, to such a degree that we mind when the very obscenely rich insult us to our faces. Fertitta, who owns the Houston Rockets, added: "We have 4,000 (job) openings right now. Between the Golden Nugget, all the restaurants and entertainment venues; people just don't want to work anymore. I don't know what happened to that part of capitalism."
People do want to work. They just want to be paid a fair wage. But even if they didn't want to work, who could blame them? After all, the people they're supposed to admire didn't earn their wealth. They cheated.
The 'job creators' fantasy is a malignant myth that rich use to squeeze the working class
Joe Maniscalco, DC Report @ Raw Story
September 06, 2021
Those now lining up against the Biden administration's $3.5 trillion Reconciliation Bill represent the same small segment of society that has always demanded working families surrender to the wants and desires of the so-called "job creators."
The "job creator" fantasy is a malignant, but persistent myth that the corporate class and their power-suited courtiers use to squeeze the working class and extract their wealth.
And they aren't abandoning it now that federal unemployment benefits have expired and there's nothing preventing landlords from spending Labor Day kicking millions of struggling American families out onto the streets.
Donald Trump's $1.9 trillion Tax Cuts and Jobs Act has already been exposed as the toxic fraud and giveaway to the ultra-rich it's always been — while covid has made the need for the largest public investment in working families since Franklin Roosevelt's New Deal abundantly clear.
North Carolina Rep. Virginia Foxx, the Republican minority leader of the Education and Labor Committee, nevertheless, seized hold of August's underwhelming jobs report (as if the figures, in and of themselves, adequately reflected the reality of working people's lives) to further prop up the "job creator" myth.
"This is what happens when you reward people for staying at home and throw gasoline on an inflation fire with socialist spending packages," the 15-year House fixture said in a pre-Labor Day statement. "If Congressional Democrats and the Biden administration really want to help struggling Americans, they will get out of the job creators' way and suspend with [House Speaker Nancy] Pelosi's $3.5 trillion spending spree which will double down on the worst of today's jobs report."
According to Forbes, the United States of America created nearly 100 new billionaires between March 2020 and March 2021. Worldwide, the figure approached nearly 500. All told, the world's super-rich stuffed another $5.5 trillion into their already bloated multi-trillion-dollar coffers.
The rich are making out like bandits during the pandemic. Is it any wonder that working men and women who were expected to imperil their own lives and the lives of their families to keep the gravy train going for billionaire corporations are increasingly unwilling to do it anymore?
When the ever-business savvy U.S. Chamber of Commerce cynically decided last month to back the bipartisan $1.2 trillion Infrastructure Investment and Jobs Act—also pending in Congress—the Chamber called it a necessary measure to help America "remain competitive" with China and "the most fiscally responsible infrastructure package in at least a decade."
By framing its support in this kind of "robust" free-market faux language, the Chamber is attempting to help shield the great "job creators" myth, while also sparing one-percenters the pain of being more heavily taxed under the $3.5 trillion Reconciliation Bill — a significantly heftier congressional measure that includes money for "soft" infrastructure items like paid leave, child care, education and health care.
Related Companies head Stephen Ross--great friend and enabler of Donald Trump—invoked the great "job creator" myth in his more than two-year quest to largely cut out the Building and Construction Trades Council of Greater New York (BCTC) from phase two of the largest private real estate development in U.S. history — the suicide-plagued Hudson Yards project, located on Manhattan's West Side.
Before the two parties reached an "accord" in 2019, furious trade unionists irate at losing jobs at Hudson Yards to cheap nonunion labor, dogged Ross and Related Cos. all around town, subjecting themselves to mass arrest, filling Sixth Avenue with thousands of rank & file protesters, and, at one point, blasting the developer as a racist union-buster during a live telecast of Fox NFL Thursday Night Football.
All the sustained union militancy prompted Joanna Rose Related Companies Executive VP of Corporate Affairs to snap, "The BCTC does not create jobs — we do."
Rose's cranky response, however, belied the billions of dollars in public subsidies, tax breaks and loans the Hudson Yards project enjoyed. Young New York City trade unionists still at the start of their building careers weren't fooled, however.
"[The property] comes from the city to begin with — that's the common ownership of the people," Laborers Local 79 member Freddie Bastone said during the protests. "And the workers, themselves, have always produced the wealth in this city. We're talking about the largest real estate development in all of American history, and the financing going into that comes right out of our pockets. And we built those buildings — we build New York. So, we're the ones who create the real wealth — they don't create anything. If we're going to have class warfare brought to us, we're going to have to bring it to them."
Fellow Laborers Local 79 member Tafadar Sourov put the situation in further perspective.
"What the developers are doing…they supervise the process of bringing together the money and the workers…but if we're really talking about who makes the job run, who actually makes sure all this wealth is used to produce — it's us. We're the producers," he said. "We're the workers. Nothing happens without us."
The $1.2 trillion Infrastructure Investment and Jobs Act is projected to create one million middle-class jobs over the next decade in engineering, accounting and construction. The 3.5 trillion Reconciliation Bill, part of Joe Biden's Build Back Better program, significantly expands the social safety net, while reportedly creating two million jobs annually.
All of these jobs are a far cry from the low-wage, junk jobs the mythic "job creators" usually produce. While the federal minimum wage has been stuck at $7.25 since 2009, the average S&P 500 company CEO-to-worker pay ratio last year rose to 299 to 1.
The Joe Manchins of the world, however, are spending this pivotal time in the nation's history fretting about the deficit and warning that the economy is on the verge of "overheating" with "millions of jobs" across the country going unfilled.
Could it be that those vaunted "job creators" just aren't any good at creating decent jobs?
The "job creator" fantasy is a malignant but persistent myth that the corporate class and their power-suited courtiers use to squeeze the working class and extract their wealth.
And they aren't abandoning it now that federal unemployment benefits have expired and there's nothing preventing landlords from spending Labor Day kicking millions of struggling American families out onto the streets.
Donald Trump's $1.9 trillion Tax Cuts and Jobs Act has already been exposed as the toxic fraud and giveaway to the ultra-rich it's always been — while covid has made the need for the largest public investment in working families since Franklin Roosevelt's New Deal abundantly clear.
North Carolina Rep. Virginia Foxx, the Republican minority leader of the Education and Labor Committee, nevertheless, seized hold of August's underwhelming jobs report (as if the figures, in and of themselves, adequately reflected the reality of working people's lives) to further prop up the "job creator" myth.
"This is what happens when you reward people for staying at home and throw gasoline on an inflation fire with socialist spending packages," the 15-year House fixture said in a pre-Labor Day statement. "If Congressional Democrats and the Biden administration really want to help struggling Americans, they will get out of the job creators' way and dispense with [House Speaker Nancy] Pelosi's $3.5 trillion spending spree which will double down on the worst of today's jobs report."
According to Forbes, the United States of America created nearly 100 new billionaires between March 2020 and March 2021. Worldwide, the figure approached nearly 500. All told, the world's super-rich stuffed another $5.5 trillion into their already bloated multi-trillion-dollar coffers.
The rich are making out like bandits during the pandemic. Is it any wonder that working men and women who were expected to imperil their own lives and the lives of their families to keep the gravy train going for billionaire corporations are increasingly unwilling to do it anymore?
When the ever-business-savvy U.S. Chamber of Commerce cynically decided last month to back the bipartisan $1.2 trillion Infrastructure Investment and Jobs Act—also pending in Congress—the Chamber called it a necessary measure to help America "remain competitive" with China and "the most fiscally responsible infrastructure package in at least a decade."
By framing its support in this kind of "robust" free-market faux language, the Chamber is attempting to help shield the great "job creators" myth, while also sparing one-percenters the pain of being more heavily taxed under the $3.5 trillion Reconciliation Bill — a significantly heftier congressional measure that includes money for "soft" infrastructure items like paid leave, child care, education and health care.
Related Companies head Stephen Ross — great friend and enabler of Donald Trump — invoked the great "job creator" myth in his more than two-year quest to largely cut out the Building and Construction Trades Council of Greater New York (BCTC) from phase two of the largest private real estate development in U.S. history — the suicide-plagued Hudson Yards project, located on Manhattan's West Side.
Before the two parties reached an "accord" in 2019, trade unionists irate at losing jobs at Hudson Yards to cheap nonunion labor dogged Ross and Related Cos. all around town, subjecting themselves to mass arrest, filling Sixth Avenue with thousands of rank & file protesters, and, at one point, blasting the developer as a racist union-buster during a live telecast of Fox NFL Thursday Night Football.
All the sustained union militancy prompted Joanna Rose, Related Companies' Executive VP of Corporate Affairs, to snap, "The BCTC does not create jobs — we do."
Rose's cranky response, however, belied the billions of dollars in public subsidies, tax breaks and loans the Hudson Yards project enjoyed. Young New York City trade unionists still at the start of their building careers weren't fooled.
"[The property] comes from the city to begin with — that's the common ownership of the people," Laborers Local 79 member Freddie Bastone said during the protests. "And the workers, themselves, have always produced the wealth in this city. We're talking about the largest real estate development in all of American history, and the financing going into that comes right out of our pockets. And we built those buildings — we build New York. So, we're the ones who create the real wealth — they don't create anything. If we're going to have class warfare brought to us, we're going to have to bring it to them."
Fellow Laborers Local 79 member Tafadar Sourov put the situation in further perspective.
"What the developers are doing…they supervise the process of bringing together the money and the workers…but if we're really talking about who makes the job run, who actually makes sure all this wealth is used to produce — it's us. We're the producers," he said. "We're the workers. Nothing happens without us."
The $1.2 trillion Infrastructure Investment and Jobs Act is projected to create one million middle-class jobs over the next decade in engineering, accounting and construction. The $3.5 trillion Reconciliation Bill, part of Joe Biden's Build Back Better program, significantly expands the social safety net, while reportedly creating two million jobs annually.
All of these jobs are a far cry from the low-wage, junk jobs the mythic "job creators" usually produce. While the federal minimum wage has been stuck at $7.25 since 2009, the average S&P 500 company CEO-to-worker pay ratio last year rose to 299 to 1.
The Joe Manchins of the world, however, are spending this pivotal time in the nation's history fretting about the deficit and warning that the economy is on the verge of "overheating" with "millions of jobs" across the country going unfilled.
Could it be that those vaunted "job creators" just aren't any good at creating decent jobs?
The Murder of the U.S. Middle Class Began 40 Years Ago This Week
Reagan’s firing of striking air traffic controllers was the first huge offensive in corporate America’s war on everyone else.
Jon Schwarz - the intercept
August 6, 2021, 8:37 a.m.
FORTY YEARS AGO, on August 5, 1981, President Ronald Reagan fired 11,345 striking air traffic controllers and barred them from ever working again for the federal government. By October of that year, the Professional Air Traffic Controllers Organization, or PATCO, the union that had called the strike, had been decertified and lay in ruins. The careers of most of the individual strikers were similarly dead: While Bill Clinton lifted Reagan’s ban on strikers in 1993, fewer than 10 percent were ever rehired by the Federal Aviation Administration.
PATCO was dominated by Vietnam War-era veterans who’d learned air traffic control in the military, and it was one of a vanishingly small number of unions to endorse Reagan in 1980, thereby scoring one of the greatest own goals in political history. It’s easy to imagine strikers expressing the same sentiments as a Trump voter who famously lamented, “I thought he was going to do good things. He’s not hurting the people he needs to be hurting.”
The PATCO saga began in February 1981, when negotiations began between the union and the FAA on a new contract. PATCO proposed changes including a 32-hour workweek and a big increase in pay. The FAA came back with counterproposals the union deemed insufficient, and on August 3, with bargaining at an impasse, most of the air traffic controllers walked out.
It was unquestionably illegal for PATCO, as a union of government workers, to strike. However, which laws are enforced is always and everywhere a political decision: Wall Street firms broke countless laws in the run-up to the 2008 financial crisis, yet almost no executives suffered any consequences. Reagan & Co. wanted to send a message that mere workers could expect no such forbearance. Just two days after the strike began, the air traffic controllers were gone.
The significance of Reagan’s actions is rarely discussed today in the mainstream, and for understandable reasons: It was the first huge offensive in a war that corporate America has been waging on this country’s middle class ever since. As Warren Buffett — current estimated net worth $101 billion — has said, “There’s class warfare, all right, but it’s my class, the rich class, that’s making war, and we’re winning.”
The stunning victory of the wealthy over everyone else can be measured in several straightforward ways. During a speech last May at a community college in Cleveland, Joe Biden explained one of them:
From 1948 after the war to 1979, productivity in America grew by 100 percent. We made more things with productivity. You know what the workers’ pay grew? By 100 percent. Since 1979, all of that changed. Productivity has grown four times faster than pay has grown. The basic bargain in this country has been broken.
Productivity is a simple but extremely important economic concept. Over time, as technology advances and society learns how to use it, each worker can produce more. One person with a bulldozer can move a lot more dirt than one person with a shovel. One person with the latest version of Microsoft Excel can do a lot more math than one person with Napier’s bones.
The meaning of Biden’s statistics is that for decades after World War II, America got much richer overall, and average worker pay went up at the same rate. Then the link between productivity and pay was severed: The U.S. overall continued to get much richer, but most of the increased wealth went to the top, not to normal people. Corporate CEOs, partners at corporate law firms, orthopedic surgeons — they make three, five, 10 times what they did in 1981. Nurses, firefighters, janitors, almost anyone without a college degree — their pay has barely budged.
The situation is especially egregious at the bottom of the pay scale. Until 1968, Congress increased the federal minimum wage in line with productivity. That year, it reached its highest level: Adjusted for inflation, it was the equivalent of $12 per hour today. It has since fallen to $7.25. Yet the whole story is far worse. Even as low-wage workers have battled fruitlessly to get the federal minimum wage raised to $15, few realize that if it had continued increasing along with productivity since 1968, it would now be over $24 per hour. At that level, a couple working full-time minimum wage jobs would take home $96,000 a year. This seems incredible, yet there are no economic reasons it couldn’t happen; we have simply made a political decision that it should not.
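The wage figures above follow from simple arithmetic; a quick sketch (assuming a 2,000-hour work year — 40 hours a week for 50 weeks — since the article doesn’t state the hours behind its numbers):

```python
# Rough arithmetic behind the minimum-wage figures cited above.
# Assumption (not stated in the article): a 2,000-hour work year,
# i.e. 40 hours/week for 50 weeks.

HOURS_PER_YEAR = 2000

def annual_pay(hourly_wage, workers=1, hours=HOURS_PER_YEAR):
    """Gross annual pay for `workers` full-time earners at `hourly_wage`."""
    return hourly_wage * hours * workers

print(annual_pay(7.25))           # one earner at today's federal minimum: 14500.0
print(annual_pay(24, workers=2))  # couple at the productivity-linked $24/hour: 96000
```

A couple at the productivity-linked $24 rate would gross roughly $96,000 a year, consistent with the article’s figure; a single earner at $7.25 grosses about $14,500.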
Another way to understand this is to look at the other end of American society. In 1995, Bill Gates had a net worth of $10 billion, worth about $18 billion in today’s dollars. That was enough to make him the richest person in America. If that were all Gates had today, there would be 25 or so billionaires ahead of him in line. Jeff Bezos, currently in first place, possesses 10 times Gates’s 1995 net worth.
Then there’s the number of significant strikes in the U.S. each year. A confident, powerful labor movement will generate large numbers of strikes; one terrorized and cowed into submission will not. According to the Labor Department, there were generally 200-400 large-scale strikes each year from 1947 to 1979. There were 187 in 1980. Then after the PATCO firing, the numbers fell off a cliff. In 1988, the last full year of Reagan’s second term, there were just 40 strikes. By 2017, there were seven.
The direct causal relationship between the firing of the air traffic controllers and the crushing of labor is widely noted and celebrated on the right. In a 2003 speech at the Reagan Library in California, then-Chair of the Federal Reserve Alan Greenspan spoke glowingly of the “flexibility” of U.S. labor markets, by which he meant “the freedom to fire.” Greenspan said that “perhaps the most important” contribution to these flexible markets “was the firing of the air traffic controllers in August 1981. … [Reagan’s] action gave weight to the legal right of private employers, previously not fully exercised, to use their own discretion to both hire and discharge workers.”
Donald Devine, the head of Reagan’s Office of Personnel Management at the time, later wrote, “American business leaders were given a lesson in managerial leadership [by Reagan] that they could not and did not ignore. Many private sector executives have told me that they were able to cut the fat from their organizations and adopt more competitive work practices because of what the government did in those days.”
The question today is whether the U.S. will ever go back to being the middle-class society it once was. Many Americans have long believed and hoped that that was the norm, and we will naturally return to it without much effort on our part. But as the past 40 years have gone by, it appears more and more that Gilded Age brutality is the U.S. norm, and the years of an American middle class were a brief exception. That means recreating it will require the same titanic struggle needed to create it in the first place.
the bipartisan con job!!!
BIPARTISAN INFRASTRUCTURE BILL INCLUDES $25 BILLION IN POTENTIAL NEW SUBSIDIES FOR FOSSIL FUELS
Instead of reducing the role of fossil fuels in the economy, critics say, the bill subsidizes industry “greenwashing.”
Alleen Brown - the intercept
August 3 2021, 4:00 a.m.
THE SENATE’S NEW bipartisan infrastructure bill is being sold as a down payment on addressing the climate crisis. But environmental advocates and academics are warning the proposed spending bill is full of new fossil fuel industry subsidies masked as climate solutions. The latest draft bill would make fossil fuel companies eligible for at least $25 billion in new subsidies, according to an analysis by the Center for International Environmental Law.
“This is billions upon billions of dollars in additional fossil fuel industry subsidies in addition to the $15 billion that we already hand out to this industry to support and fund this industry,” said Jim Walsh, Food and Water Watch’s senior policy analyst. Scientists say that to meet the goals of the international Paris climate accord, the U.S. would need to reach net-zero emissions by 2050 — and be well on the way there by 2030. With subsidies that keep fossil fuel industries going, Walsh said, “We will never be able to meet the Paris agreement if we fund these kind of programs.”
Just as concerning is the new economy the subsidies could entrench, said Walsh, through the creation of new fossil fuel infrastructure. “This would support the development of four petrochemical hubs that would create profit incentives for greenhouse gas emission production and would be focused on finding new ways of integrating fossil fuels into our economy for transportation, energy, petrochemical development, and plastics.”
In short, he added, “This deal envisions a world where we will use fossil fuels into perpetuity.”
Industry-Backed “Climate” Projects
The subsidies would go toward technologies sold as dream fixes for ending the nightmare of the climate crisis without the colossal political hurdle of dislodging the fossil fuel industry from the U.S. economy. Such technologies include carbon capture and decarbonized hydrogen fuel. Both purported solutions in practice help fossil fuel companies mask the continued release of climate-warming gases. Neither technology is currently commercially viable at a large scale, so the energy industry requires government help to carry out what critics see as a public relations scheme.
The bill includes billions of dollars for carbon capture, utilization, and storage; hydrogen fuel made from natural gas; and “low emissions buses” that could run on fuels including hydrogen and natural gas. It also encourages subsidies that go unquantified in the legislation, for example urging states to waive property taxes for pipelines to transport captured carbon.
The devil is in the details. The vast majority of clean-sounding hydrogen is made from natural gas and produces the greenhouse gas carbon dioxide as a waste product. The process itself requires energy, typically supplied by burning more natural gas, which also produces greenhouse gases. Meanwhile, carbon capture and storage are promoted primarily as a means to clean up continued emissions from fossil fuel processing facilities. Carbon capture would do nothing to resolve the array of severe environmental problems caused upstream by drilling, fracking, and mining — let alone the downstream burning of the fuels for energy.
The survival of the fossil fuel industry depends on its ability to convince the public that corporations are taking steps to address the climate crisis. Hydrogen and carbon capture, utilization, and storage have been two of the industry’s key strategies for achieving that goal. Exxon Mobil, Royal Dutch Shell, and Chevron, just to name a few, have touted their investments in hydrogen and carbon capture.
While long-shot, industry-supported “climate” projects depend on government subsidies, so does the rapid scale-up of renewable energy sources already proven to meaningfully slow down the spiraling climate crisis. Put simply, wind and solar work as climate fixes right now, while carbon capture and “decarbonized” hydrogen do not.
Yet the Democrats and Republicans pushing the infrastructure compromise are choosing to give the fossil fuel industry a lifeline instead of providing funding for proven renewable energy technology. Even bill provisions that facilitate renewable energy development contain language that could allow funds to go instead to fossil fuel industry “solutions.”
“Any legislation funding carbon capture and storage or use or direct air capture is legalizing the funding of scam technologies that merely increase air pollution death and illness, mining and its damage, and fossil-fuel infrastructure, and they have no provable carbon benefit,” said Mark Jacobson, a professor of civil and environmental engineering at Stanford University. “By far, the best thing to do with the subsidy money for this is to purchase wind, solar, and storage to eliminate fossil fuels.”
Little-Understood Technologies
Senate Majority Leader Chuck Schumer, D-N.Y., hopes to finalize the latest $550 billion bipartisan iteration of the infrastructure bill by the end of the week. The legislation will also have to make it through the House and will ultimately be complemented by hundreds of billions in additional provisions to be hammered out through a separate process called reconciliation, which requires no Republican support.
President Joe Biden kicked off the process with his own blueprint, the $2.5 trillion American Jobs Plan. Republicans, however, didn’t come up with the carbon capture and hydrogen spending: Many of the industry-friendly proposals were part of Biden’s plan from the start. “It’s truly bipartisan, which makes me cringe,” said Walsh.
The bill is moving fast, and the billions in funding are set to become law at a time when policymakers and the public still lack a firm grasp on how the technologies work.
Hydrogen has become the latest darling of the fossil fuel industry. So-called clean or “blue” hydrogen would use carbon capture and storage to neutralize the greenhouse gas emissions associated with the process. Another type of the fuel, called “green” hydrogen, uses electricity drawn from renewables.
Neither “blue” nor “green” means of hydrogen production, however, are widely used. For instance, only two facilities in the world have tried to commercially produce decarbonized “blue” hydrogen. As a result, 96 percent of hydrogen fuel globally comes from carbon-intensive means of production, according to a 2019 report. Research out of Stanford and Cornell Universities indicates “blue” hydrogen produces more climate-warming gases than simply using natural gas directly.
The infrastructure bill calls for a national strategy to put “clean hydrogen” into action, including four regional hydrogen hubs. The provision explicitly ties one hub to fossil fuels and calls for two others to be near natural gas resources.
Likewise, the carbon capture measure in the bill ties government investment to areas “with high levels of coal, oil, or natural gas resources.”
Existing carbon capture projects have repeatedly run into problems, including a heavily subsidized Chevron facility dubbed the largest carbon capture project in the world, which was attached to a liquid natural gas export facility in Australia and recently deemed a technological failure. Exacerbating the problem is that there is no real market for captured carbon — except to use captured gases to produce even more oil from old wells. While the legislation puts money toward creating new uses for the trapped gases, large-scale markets are a far-off prospect.
Some proponents argue that carbon capture and hydrogen fuels could ultimately be beneficial for the climate if used for narrow purposes, like capturing carbon from steel production. But there is nothing in the bill preventing the fossil fuel industry from using the purportedly climate-friendly technologies to shore up its image while continuing to release emissions — a tactic known as “greenwashing.”
Environmental justice groups are clear about where they stand. Biden’s Environmental Justice Advisory Council issued a report in May that included carbon capture and storage among a list of technologies that will not benefit communities. Separately, a group of hundreds of organizations, ranging from Ben & Jerry’s to 350.org, sent a letter to Democratic leaders on July 19 urging them to resist energy strategies reliant on carbon capture, utilization, and storage.
The letter reads, “Investing in carbon capture delays the needed transition away from fossil fuels and other combustible energy sources, and poses significant new environmental, health, and safety risks, particularly to Black, Brown, and Indigenous communities already overburdened by industrial pollution, dispossession, and the impacts of climate change.”
“This is billions upon billions of dollars in additional fossil fuel industry subsidies in addition to the $15 billion that we already hand out to this industry to support and fund this industry,” said Jim Walsh, Food and Water Watch’s senior policy analyst. Scientists say that to meet the goals of the international Paris climate accord, the U.S would need to reach net-zero emissions by 2050 — and be well on the way there by 2030. With subsidies that keep fossil fuel industries going, Walsh said, “We will never be able to meet the Paris agreement if we fund these kind of programs.”
Just as concerning is the new economy the subsidies could entrench, said Walsh, through the creation of new fossil fuel infrastructure. “This would support the development of four petrochemical hubs that would create profit incentives for greenhouse gas emission production and would be focused on finding new ways of integrating fossil fuels into our economy for transportation, energy, petrochemical development, and plastics.”
In short, he added, “This deal envisions a world where we will use fossil fuels into perpetuity.”
Industry-Backed “Climate” Projects
The subsidies would go toward technologies sold as dream fixes for ending the nightmare of the climate crisis without the colossal political hurdle of dislodging the fossil fuel industry from the U.S. economy. Such technologies include carbon capture and decarbonized hydrogen fuel. Both purported solutions in practice help fossil fuel companies mask the continued release of climate-warming gases. Neither of the technologies is currently commercially viable at a large scale, so the energy industry requires government help to carry out what critics see as a public relations scheme.
The bill includes billions of dollars for carbon capture, utilization, and storage; hydrogen fuel made from natural gas; and “low emissions buses” that could run on fuels including hydrogen and natural gas. It also encourages subsidies that go unquantified in the legislation, for example urging states to waive property taxes for pipelines to transport captured carbon.
The devil is in the details. The vast majority of clean-sounding hydrogen is made from natural gas and produces the greenhouse gas carbon dioxide as a waste product. The process itself requires energy, typically supplied by burning more natural gas, which also produces greenhouse gases. Meanwhile, carbon capture and storage are promoted primarily as a means to clean up continued emissions from fossil fuel processing facilities. Carbon capture would do nothing to resolve the array of severe environmental problems caused upstream by drilling, fracking, and mining — let alone the downstream burning of the fuels for energy.
The survival of the fossil fuel industry depends on its ability to convince the public that corporations are taking steps to address the climate crisis. Hydrogen and carbon capture, utilization, and storage have been two of the industry’s key strategies for achieving that goal. Exxon Mobil, Royal Dutch Shell, and Chevron, just to name a few, have touted their investments in hydrogen and carbon capture.
Long-shot, industry-supported “climate” projects depend on government subsidies, but so does the rapid scale-up of renewable energy sources already proven to meaningfully slow the spiraling climate crisis. Put simply, wind and solar work as climate fixes right now, while carbon capture and “decarbonized” hydrogen do not.
Yet the Democrats and Republicans pushing the infrastructure compromise are choosing to give the fossil fuel industry a lifeline instead of providing funding for proven renewable energy technology. Even bill provisions that facilitate renewable energy development contain language that could allow funds to go instead to fossil fuel industry “solutions.”
“Any legislation funding carbon capture and storage or use or direct air capture is legalizing the funding of scam technologies that merely increase air pollution death and illness, mining and its damage, and fossil-fuel infrastructure, and they have no provable carbon benefit,” said Mark Jacobson, a professor of civil and environmental engineering at Stanford University. “By far, the best thing to do with the subsidy money for this is to purchase wind, solar, and storage to eliminate fossil fuels.”
Little-Understood Technologies
Senate Majority Leader Chuck Schumer, D-N.Y., hopes to finalize the latest $550 billion bipartisan iteration of the infrastructure bill by the end of the week. The legislation will also have to make it through the House and will ultimately be complemented by hundreds of billions in additional provisions to be hammered out through a separate process called reconciliation, which requires no Republican support.
President Joe Biden kicked off the process with his own blueprint, the $2.5 trillion American Jobs Plan. Republicans, however, didn’t come up with the carbon capture and hydrogen spending: Many of the industry-friendly proposals were part of Biden’s plan from the start. “It’s truly bipartisan, which makes me cringe,” said Walsh.
The bill is moving fast, and the billions in funding are set to become law at a time when policymakers and the public still lack a firm grasp on how the technologies work.
Hydrogen has become the latest darling of the fossil fuel industry. So-called clean or “blue” hydrogen would use carbon capture and storage to neutralize the greenhouse gas emissions associated with the process. Another type of the fuel, called “green” hydrogen, uses electricity drawn from renewables.
Neither “blue” nor “green” hydrogen production, however, is widely used. For instance, only two facilities in the world have tried to commercially produce decarbonized “blue” hydrogen. As a result, 96 percent of hydrogen fuel globally comes from carbon-intensive means of production, according to a 2019 report. Research out of Stanford and Cornell Universities indicates that “blue” hydrogen production releases more climate-warming gases than simply burning natural gas directly.
The infrastructure bill calls for a national strategy to put “clean hydrogen” into action, including four regional hydrogen hubs. The provision explicitly ties one hub to fossil fuels and calls for two others to be near natural gas resources.
Likewise, the carbon capture measure in the bill ties government investment to areas “with high levels of coal, oil, or natural gas resources.”
Existing carbon capture projects have repeatedly run into problems, including a heavily subsidized Chevron facility dubbed the largest carbon capture project in the world, which was attached to a liquid natural gas export facility in Australia and recently deemed a technological failure. Exacerbating the problem is that there is no real market for captured carbon — except to use captured gases to produce even more oil from old wells. While the legislation puts money toward creating new uses for the trapped gases, large-scale markets are a far-off prospect.
Some proponents argue that carbon capture and hydrogen fuels could ultimately be beneficial for the climate if used for narrow purposes, like capturing carbon from steel production. But there is nothing in the bill preventing the fossil fuel industry from using the purportedly climate-friendly technologies to shore up its image while continuing to release emissions — a tactic known as “greenwashing.”
Environmental justice groups are clear about where they stand. Biden’s Environmental Justice Advisory Council issued a report in May that included carbon capture and storage among a list of technologies that will not benefit communities. Separately, a group of hundreds of organizations, ranging from Ben & Jerry’s to 350.org, sent a letter to Democratic leaders on July 19 urging them to resist energy strategies reliant on carbon capture, utilization, and storage.
The letter reads, “Investing in carbon capture delays the needed transition away from fossil fuels and other combustible energy sources, and poses significant new environmental, health, and safety risks, particularly to Black, Brown, and Indigenous communities already overburdened by industrial pollution, dispossession, and the impacts of climate change.”
There's A Stark Red-Blue Divide When It Comes To States' Vaccination Rates
June 9, 2021, 7:00 AM ET
DOMENICO MONTANARO - NPR
Less than a month remains until the Fourth of July, President Biden's goal date for 70% of American adults to have gotten at least one dose of a COVID-19 vaccine.
It looks like it's going to be a stretch to get there.
As of Tuesday, nearly 64% of U.S. adults have had at least one shot, according to data from the Centers for Disease Control and Prevention.
The key issue is that demand has dropped off. After an initial crush, the number of doses being administered daily is on a steep decline from the early April peak.
So what's going on? A few things to note:
- There's a huge political divide. Speaking over the weekend, former President Donald Trump took credit for the vaccine rollout and told a North Carolina crowd of supporters that "most of you" have likely been vaccinated.
- But surveys have shown Trump supporters are the least likely to say they have been vaccinated or plan to be. Remember, Trump got vaccinated before leaving the White House, but that was reported months later. Unlike other public officials who were trying to encourage people to get the shot, Trump did it in private.
- The top 22 states (including D.C.) with the highest adult vaccination rates all went to Joe Biden in the 2020 presidential election.
- Some of the least vaccinated states are the most pro-Trump. Trump won 17 of the 18 states with the lowest adult vaccination rates. Many of these states have high proportions of whites without college degrees.
But it's not just about politics:
- Black Americans, who vote overwhelmingly Democratic, aren't getting the vaccine at the rate of whites. Less than a quarter of Black Americans had gotten at least one vaccine dose as of Tuesday, according to the CDC. It's the lowest of any racial or ethnic group listed.
- Black Americans also make up a significant percentage of the population in places like Alabama, Mississippi, Louisiana, Tennessee, Arkansas, South Carolina and Georgia. Those are seven of the 10 states with the lowest adult vaccination rates, though the gathering of data by race and ethnicity has been spotty depending on the state.
- Young people, who also lean heavily toward Democrats, are also less likely to get vaccinated. More than 80% of people over 65 have gotten at least one shot, compared with just 45% of 18- to 24-year-olds and 51% of those 25 to 39.
- And it's not necessarily about hesitancy. The May NPR/PBS NewsHour/Marist poll found 75% of Black adults said they had gotten a shot or would get it when one came available. That was about the same as white adults, but Black adults trailed whites when it came to those who said they'd actually received one.
- Equitable distribution of the vaccines has been a focus of the Biden White House, and they can't be happy with the lag.
Less-Educated Workers Were Hardest Hit in Recession. That Hasn’t Changed.
BY Dean Baker, Center for Economic and Policy Research - truthout
PUBLISHED May 7, 2021
The April employment report was considerably weaker than had generally been expected, with the economy adding just 266,000 jobs. Furthermore, the prior two months' numbers were revised down by 78,000. The unemployment rate edged up to 6.1 percent, but this was entirely due to more people entering the labor force. The employment-to-population ratio (EPOP) also edged up by 0.1 percentage point to 57.9 percent. That is still 2.9 percentage points below its average for 2019.
Performance Across Sectors Was Very Mixed
The leisure and hospitality sector accounted for more than all the gains in April, adding 331,000 jobs. Restaurants added 187,000; arts and entertainment added 89,600; and hotels added 54,400. State and local government added a surprisingly low 39,000 jobs, almost all in education. Employment in state and local governments is still 1,278,000 below the pre-pandemic level. There should be large employment increases here as more schools reopen in May.
Several sectors were big job losers. Manufacturing lost 18,000 jobs, which was entirely attributable to a loss of 27,000 jobs in the car industry. This was due to shutdowns caused by a shortage of semiconductors.
There was a loss of 77,400 jobs in the courier industry and 111,400 in the temp sector. It’s not clear whether these declines reflect demand or supply conditions. These tend to be lower paying jobs, so workers may have better alternatives. On the other hand, as people feel more comfortable going out after being vaccinated there may be less demand for couriers.
There was also a loss of 49,400 jobs in food stores, which could reflect reduced demand as people increasingly are going to restaurants. Employment in the sector is still almost 40,000 higher than the pre-pandemic level. Nursing care facilities lost 18,800 jobs (1.3 percent of employment). These also tend to be low-paying jobs, so this could reflect supply conditions.
Construction showed no change in employment in April. This could just be a timing fluke; the sector was reported as adding 97,000 jobs in March, and there is plenty of evidence that the sector is booming.
Some Evidence of Labor Shortages in Low-Paying Sectors
If employers are having trouble finding workers, as many claim, then we should expect to see more rapid wage growth and an increase in the length of the workweek, as employers try to work their existing workforce more hours. We do see some evidence of both.
The annual rate of wage growth, comparing the last three months (February, March, and April) with the prior three months, was 3.7 percent for production and nonsupervisory workers overall, 4.1 percent for retail, and 17.6 percent for leisure and hospitality. These data are erratic, but they do indicate some acceleration in wage growth, especially for hotels and restaurants.
There is also some evidence for an increasing length of the workweek, which is consistent with employers having trouble getting workers. For production and nonsupervisory workers overall, weekly hours are up 0.8 hours from the 2019 average. It is the same for retail, and 0.6 hours for leisure and hospitality.
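The annualized comparison used above (the average of the last three months against the average of the prior three, compounded over four quarters) can be sketched as a quick calculation. The wage figures below are hypothetical, purely for illustration, not the actual BLS series:

```python
# Annualized wage growth from two adjacent three-month averages.
# A three-month period is one quarter, so the quarter-over-quarter
# growth rate is compounded over 4 periods to express it as an
# annual rate.

def annualized_growth(prior_months, recent_months):
    """Annualize the growth between two three-month average wages."""
    prior_avg = sum(prior_months) / len(prior_months)
    recent_avg = sum(recent_months) / len(recent_months)
    quarterly = recent_avg / prior_avg - 1
    return (1 + quarterly) ** 4 - 1

# Hypothetical average hourly earnings (dollars):
prior = [14.90, 14.95, 15.00]    # Nov, Dec, Jan
recent = [15.20, 15.45, 15.70]   # Feb, Mar, Apr

print(f"{annualized_growth(prior, recent):.1%}")  # prints 14.1%
```

Because a small quarterly change is raised to the fourth power, this measure swings sharply month to month, which is why the author cautions that "these data are erratic."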
Recovery Is Benefiting More Educated Workers
Less-educated workers were hardest hit in the recession, but many of us hoped that the situation would even out as the recovery progressed. This has not yet happened.
The unemployment rate for college grads fell to 3.5 percent in April, compared with 4.0 percent in January. For high school grads, the drop over this period was from 7.1 percent to 6.9 percent. The EPOP for high school grads is now 4.3 percentage points below its 2019 average; for college grads, the EPOP is down by just 2.6 percentage points.
Involuntary Part-Time Work Falls Sharply
There was a sharp fall in involuntary part-time work of 583,000. The current level is roughly equal to 2017 levels. By contrast, voluntary part-time is still 2,400,000 below its 2019 average, a drop of almost 12 percent. This reflects the loss of jobs in restaurants and hotels, many of which are part-time.
Unemployment Due to Voluntary Quits Remains Low
The share of unemployment due to voluntary quits edged up only slightly to 8.3 percent, still lower than at any point in the last quarter century, excepting the Great Recession. This measure is usually seen as a sign of workers’ confidence in their labor market prospects.
It is also worth noting that the share of long-term unemployment remains extraordinarily high, although it did edge down slightly from 43.4 percent to 43.0 percent. The all-time high for this measure was 45.2 percent in the Great Recession.
Mixed Report — Economy Is Moving in the Right Direction, but Slowly
In more normal times job growth of 266,000 would be seen as very strong, but not when the economy is down more than 8 million jobs. The job loss in some sectors may prove to be anomalies, and some of the slow growth is almost certainly just a question of timing, as with state and local government employment. It is possible that the unemployment insurance supplements are having some disincentive effect, but if so, that will quickly dwindle as they end in September.
Even with the weak job growth reported for April, the average growth for the last three months is still 524,000. If we can maintain that pace, the labor market will be looking pretty good by the end of the year.
fools in charge, wasting your money!!!
Even by Pentagon terms, this was a dud: The disastrous saga of the F-35
The military-industrial complex spent $2 trillion building a "flying Swiss Army knife." Now it's been shelved
By LUCIAN K. TRUSCOTT IV - salon
FEBRUARY 27, 2021 1:00PM (UTC)
Somehow the United States has managed to develop a fighter jet for all three services — the Air Force, Navy and Marines — that goes for $100 million apiece, ran up almost a half-trillion dollars in total development costs, will cost almost $2 trillion over the life of the plane, and yet it can't be flown safely.
How did this happen, you ask? Well, it's a long, complicated story, but basically it involves taking something that's supposed to do one thing and do it well, like take off from the ground and fly really fast, and adding stuff like being able to take off and land on an aircraft carrier or hover like a hummingbird.
That's why they call it the "flying Swiss Army knife." Have you ever tried to use one of the things? First of all, you can't find the knife blade, hidden as it is among scissors and screwdrivers and can openers and nose hair tweezers and nail files and pliers. The geniuses at the Pentagon decided they needed to replace the aging F-16 fighter, and everybody wanted in on it.
The F-16 is what you would call the M1A1 airplane of U.S. forces. The Air Force currently has about 1,250 of the planes, with 700 of those in the active duty Air Force, about 700 in the Air National Guard, and 50 in the Reserves. General Dynamics has built about 4,600 of them since the plane became operational in the mid-1970s, and they are used by allied air forces all over the world. You fill them up with jet fuel, push the starter button and take off. It will fly at twice the speed of sound, it will carry 15 different bombs, including two nuclear weapons, it can shoot down enemy aircraft with five different varieties of air-to-air missiles, it can knock out ground targets with four different air-to-ground missiles, and it can carry two kinds of anti-ship missiles. The thing is an all-around killing machine.
The F-35, on the other hand, can't fly at twice the speed of sound. In fact, it comes with what amounts to a warning label on its control panel marking supersonic flight as "for emergency use only." So it's OK to fly the thing like a 737, but if you want to go really fast, you have to ask permission, which promises to work really, really well in a dogfight. What are pilots going to do if they're being pursued by a supersonic enemy jet?
The F-35 will carry four different air-to-air missiles, six air-to-ground missiles and one anti-ship missile, but the problem is, all of them have to be fired from the air, and right now, the F-35 isn't yet "operational," which means, essentially, that it's so unsafe to fly the damn things, they spend most of their time parked.
Take the problem they have with switches. The developers of the F-35 decided to go with touchscreen switches rather than the physical ones used in other fighters, like toggles or rocker switches. That would be nice if they worked, but pilots report that the touchscreen switches don't function 20 percent of the time. So you're flying along, and you want to drop your landing gear to land, but your touchscreen decides "not this time, pal" and refuses to work. How would you like to be driving your car and have your brakes decide not to work 20 percent of the time, like, say, when you're approaching a red light at a major intersection?
But it gets worse. The heat coating on the engine's rotor blades is failing at a rate that leaves 5 to 6 percent of the F-35 fleet parked on the tarmac at any given time, awaiting not just engine repairs, but total replacement. Then there's the canopy. You know what a canopy is, don't you? It's the clear bubble pilots look through so they can see to take off and land, not to mention see other aircraft, such as enemy aircraft. Well, it seems F-35 canopies have decided to "delaminate" at inappropriate times, making flying the things dangerous if not impossible. So many of them have failed that the Pentagon has had to fund an entirely new canopy manufacturer to make replacements.
There's also the problem with the plane's "stealth" capability, which is compromised if you fly the thing too fast, because the coating that makes the plane invisible to radar has a bad habit of peeling off, making the planes completely visible to enemy radar.
But fear not, Air Force Chief of Staff Gen. Charles Q. Brown Jr. has come up with a solution. He announced last week that henceforth, the Pentagon is going to treat the F-35 as the "Ferrari" of the U.S. combat air fleet. "You don't drive your Ferrari to work every day, you only drive it on Sundays. This is our 'high end' fighter, we want to make sure we don't use it all for the low-end fight," he said in a press conference on Feb. 17.
Got it. If an enemy decides to start a war on a Tuesday or Wednesday, we'll just "drive" our aging F-16s, so our precious F-35s can be left in the garage waiting for good weather on Sunday. I'm sure we can get everyone to sign up for the "we'll only go to war on Sunday" treaty.
The F-35 can be understood best as a na-na-na-na-na problem. Originally developed for the Air Force, the minute the thing was on the drafting table, the Navy and Marines started crying, "Hey, what about us?" To quiet the jealous fit being thrown by the other services, the Pentagon agreed to turn the thing into the "Swiss Army knife" it has become.
A variant capable of taking off from and landing on carriers was promised to the Navy, with bigger wings and a tail hook. Except the tail hook refused to work for the first two years it was tested, meaning that every carrier landing had to take place in sight of land so the Navy F-35 could fly over to the coast and land safely on a runway.
The Marine variety had to be capable of vertical takeoff and landing, because the Navy was jealous of its carriers and would only agree to allow the Marines to have mini-carriers with landing surfaces big enough for vertical use. That meant the Marine version had to be redesigned so it had a big flap under the engine to divert thrust so the thing could land on Marine ships. This meant the Marine version had added weight and space that would otherwise be used to carry weapons.
So you're a Marine, and you're flying along in your F-35 and an enemy comes along and starts shooting at you, and you shoot back and miss, but you don't have another missile, because where that missile should be is where your damn vertical landing flap is.
Maybe they should just issue F-35 pilots a bunch of flags to use when they take to the air, and then they'd be ready for anything. Tail starts coming off because you went supersonic for too long? Fly your NO FAIR flag. Cockpit delaminating? Grab your JUST A MINUTE I can't see you flag. Engine rotor blades burning up? That would be the OOOPS can't dogfight right now, I'm waiting on a replacement engine flag.
Not to worry, pilots, the Pentagon is on the problem and they have a solution. Brown says they're going back to the drawing board for a "fifth generation-minus" fighter jet, meaning they want to come up with something that looks like and flies like and has the combat capabilities of the good old F-16. Only problem is, if you use the F-35 project as a benchmark, it will be two decades before the "minus" jet is operational. Until then, guys, have fun watching your F-35s gather dust on the tarmac while you continue to fly your F-16s, which will be older than the average pilot's grandfather by the time the new plane is ready.
How did this happen, you ask? Well, it's a long, complicated story, but basically it involves taking something that's supposed to do one thing and do it well, like take off from the ground and fly really fast, and adding stuff like being able to take off and land on an aircraft carrier or hover like a hummingbird.
That's why they call it the "flying Swiss Army knife." Have you ever tried to use one of the things? First of all, you can't find the knife blade, hidden as it is among scissors and screwdrivers and can openers and nose hair tweezers and nail files and pliers. The geniuses at the Pentagon decided they needed to replace the aging F-16 fighter, and everybody wanted in on it.
The F-16 is what you would call the M1A1 airplane of U.S. forces. The Air Force currently has about 1,250 of the planes, with 700 of those in the active duty Air Force, about 700 in the Air National Guard, and 50 in the Reserves. General Dynamics has built about 4,600 of them since the plane became operational in the mid-1970s, and they are used by allied air forces all over the world. You fill them up with jet fuel, push the starter button and take off. It will fly at twice the speed of sound, it will carry 15 different bombs, including two nuclear weapons, it can shoot down enemy aircraft with five different varieties of air-to-air missiles, it can knock out ground targets with four different air-to-ground missiles, and it can carry two kinds of anti-ship missiles. The thing is an all-around killing machine.
The F-35, on the other hand, can't fly at twice the speed of sound. In fact, it comes with what amounts to a warning label on its control panel marking supersonic flight as "for emergency use only." So it's OK to fly the thing like a 737, but if you want to go really fast, you have to ask permission, which promises to work really, really well in a dogfight. What are pilots going to do if they're being pursued by a supersonic enemy jet?
The F-35 will carry four different air-to-air missiles, six air-to-ground missiles and one anti-ship missile, but the problem is, all of them have to be fired from the air, and right now, the F-35 isn't yet "operational," which means, essentially, that it's so unsafe to fly the damn things, they spend most of their time parked.
Take the problem they have with switches. The developers of the F-35 decided to go with touchscreen switches rather than the physical ones used in other fighters, like toggles or rocker switches. That would be nice if they worked, but pilots report that the touchscreen switches don't function 20 percent of the time. So you're flying along, and you want to drop your landing gear to land, but your touchscreen decides "not this time, pal" and refuses to work. How would you like to be driving your car and have your brakes decide not to work 20 percent of the time, like, say, when you're approaching a red light at a major intersection?
But it gets worse. The heat coating on the engine's rotor blades is failing at a rate that leaves 5 to 6 percent of the F-35 fleet parked on the tarmac at any given time, awaiting not just engine repairs, but total replacement. Then there's the canopy. You know what a canopy is, don't you? It's the clear bubble pilots look through so they can see to take off and land, not to mention see other aircraft, such as enemy aircraft. Well, it seems F-35 canopies have decided to "delaminate" at inappropriate times, making flying the things dangerous if not impossible. So many of them have failed that the Pentagon has had to fund an entirely new canopy manufacturer to make replacements.
There's also the problem with the plane's "stealth" capability, which is compromised if you fly the thing too fast, because the coating that makes the plane invisible to radar has a bad habit of peeling off, making the planes completely visible to enemy radar.
But fear not, Air Force Chief of Staff Gen. Charles Q. Brown Jr. has come up with a solution. He announced last week that henceforth, the Pentagon is going to treat the F-35 as the "Ferrari" of the U.S. combat air fleet. "You don't drive your Ferrari to work every day, you only drive it on Sundays. This is our 'high end' fighter, we want to make sure we don't use it all for the low-end fight," he said in a press conference on Feb. 17.
Got it. If an enemy decides to start a war on a Tuesday or Wednesday, we'll just "drive" our aging F-16s, so our precious F-35s can be left in the garage waiting for good weather on Sunday. I'm sure we can get everyone to sign up for the "we'll only go to war on Sunday" treaty.
The F-35 can be understood best as a na-na-na-na-na problem. Originally developed for the Air Force, the minute the thing was on the drafting table, the Navy and Marines started crying, "Hey, what about us?" To quiet the jealous fit being thrown by the other services, the Pentagon agreed to turn the thing into the "Swiss Army knife" it has become.
A variant capable of taking off from and landing on carriers was promised to the Navy, with bigger wings and a tail hook. Except the tail hook refused to work for the first two years it was tested, meaning that every carrier landing had to take place in sight of land so the Navy F-35 could fly over to the coast and land safely on a runway.
The Marine variety had to be capable of vertical takeoff and landing, because the Navy was jealous of its carriers and would only agree to allow the Marines to have mini-carriers with landing surfaces big enough for vertical use. That meant the Marine version had to be redesigned so it had a big flap under the engine to divert thrust so the thing could land on Marine ships. This meant the Marine version carried added weight and gave up space that would otherwise be used to carry weapons.
So you're a Marine, and you're flying along in your F-35 and an enemy comes along and starts shooting at you, and you shoot back and miss, but you don't have another missile, because where that missile should be is where your damn vertical landing flap is.
Maybe they should just issue F-35 pilots a bunch of flags to use when they take to the air, and then they'd be ready for anything. Tail starts coming off because you went supersonic for too long? Fly your NO FAIR flag. Cockpit delaminating? Grab your JUST A MINUTE I can't see you flag. Engine rotor blades burning up? That would be the OOOPS can't dogfight right now, I'm waiting on a replacement engine flag.
Not to worry, pilots, the Pentagon is on the problem and they have a solution. Brown says they're going back to the drawing board for a "fifth generation-minus" fighter jet, meaning they want to come up with something that looks like and flies like and has the combat capabilities of the good old F-16. Only problem is, if you use the F-35 project as a benchmark, it will be two decades before the "minus" jet is operational. Until then, guys, have fun watching your F-35s gather dust on the tarmac while you continue to fly your F-16s, which will be older than the average pilot's grandfather by the time the new plane is ready.
Don't worry: If you're concerned about rising Federal debt -- read this
Dean Baker, DC Report @ Raw Story
February 21, 2021
How would our children know that they face a crushing debt burden? The question may seem silly.
Of course, they will know because there are a number of well-funded policy shops that will be spewing out endless papers and columns telling them that they are facing a crushing debt burden.
Because these policy shops are well-funded and well-connected we can be sure that major media outlets, like The New York Times, The Washington Post and National Public Radio, will give their complaints plenty of space.
But let's imagine a world where our children weren't constantly being told that they face a crushing debt burden.
How would they know?
Even with a higher tax burden due to the debt we are now building up, workers 10 years out should enjoy substantially higher living standards than they do today.
It might be hard if the latest budget projections are close to the mark. The Congressional Budget Office (CBO) just released new projections for the budget and the economy.
They show that in 2031, the last year in their budget horizon, the interest burden on our debt will be 2.4% of Gross Domestic Product. That's up from current interest costs of 1.4% of GDP. That implies an increase in the debt burden, measured by interest costs, of 1.0 percentage point of GDP.
Side note 1: The true debt burden is actually somewhat less. Last year the Federal Reserve Board refunded $88 billion, roughly 0.4% of GDP, to the Treasury. This was based on interest that it had collected on the bonds it held. That leaves the actual interest burden at around 1% of GDP.
Will an interest burden of 2.4% of GDP crush our children?
On the face of it, the deficit hawks have a hard case here. The interest burden was over 3.0% of GDP for most of the early and mid-1990s. And for those who were not around or have forgotten, the 1990s, or at least the second half, was a very prosperous decade. It's a bit hard to see how an interest burden of 2.4% of GDP can be crushing if burdens of more than 3.0% of GDP were not a big problem.
Imagining Even More Debt
But, the debt burden may be higher than the current projections show. After all, President Biden has proposed a $1.9 trillion pandemic rescue package. He also will have other spending initiatives. The CBO baseline includes tax increases in current law that may not actually go into effect.
CBO's latest projections put the debt (currently $22 trillion) at $35.3 trillion in 2031.
Let's assume that the rescue package and other issues raise the debt for that year by 10%, or $3.5 trillion. This brings the interest burden to 2.7% of GDP. That's still below the 1990s level.
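The arithmetic here can be reproduced in a few lines. This is a back-of-envelope sketch using only the article's round numbers; it assumes the average interest rate on the debt stays constant, so the burden scales in proportion to the debt stock (the article's 2.7% likely reflects slightly different rounding or rate assumptions):

```python
# Back-of-envelope check of the debt arithmetic above.
# All inputs are the article's figures; everything else is inferred.

baseline_debt = 35.3     # trillions of dollars, CBO projection for 2031
baseline_burden = 0.024  # interest costs as a share of GDP

extra_debt = 0.10 * baseline_debt  # the assumed 10% bump
new_debt = baseline_debt + extra_debt

# If the average interest rate on the debt stays the same, the burden
# scales in proportion to the debt stock.
new_burden = baseline_burden * (new_debt / baseline_debt)

print(f"extra debt: ${extra_debt:.2f} trillion")       # ~ $3.53 trillion
print(f"new interest burden: {new_burden:.1%} of GDP")  # 2.6%, which the article rounds to 2.7%
```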
Furthermore, insofar as the rescue package and other initiatives are successful in boosting growth, GDP, the denominator in this calculation, will be larger, which will at least partially offset the higher interest burden.
One point the deficit hawks add to this calculation is that interest rates are extraordinarily low at present.
CBO does project that interest rates will rise, but in 2031 they still project an interest rate on 10-year Treasury bonds of just 3%. This is up from 1.1% at present, but still well below the rates we saw over the 40 years before the Great Recession. It certainly is not impossible that interest rates will rise to 4% or even 5%.
Higher Interest Rates
Higher rates will mean that the debt poses a greater interest burden. But there are a couple of important qualifications that need to be made. First, much of our debt is long-term. The 30-year bond issued in 2021 at a 2% interest rate doesn't have to be refinanced until 2051. That means that even if interest rates do rise substantially they will only gradually lead to a substantially higher interest burden.
The other point is that we have to ask about the reason interest rates are rising.
It is possible that interest rates will be rising even as the inflation rate remains more or less in line with CBO's latest projection of around 2%. In that case, higher interest rates mean a greater burden over time.
However, interest rates may also rise because we see higher than projected inflation. Suppose the inflation rate rises to 3%, roughly a percentage point higher than projected. If interest rates also rise by a percentage point, so that the interest rate on a 10-year Treasury bond in 2031 is 4%, instead of 3%, we would still be looking at the same real interest rate. In that case, the value of the bond would be eroded by an extra 1 percentage point annually, due to the impact of higher inflation.
In the case where higher inflation is the reason for higher interest rates, the actual burden of the debt does not change.
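The real-rate point lends itself to a one-line check. The sketch below works in percentage points and uses the common approximation that the real rate is the nominal rate minus the inflation rate (the exact Fisher relation is slightly different, but the conclusion is the same):

```python
# Sketch of the article's real-rate argument, in percentage points.
# Approximation: real rate = nominal rate - inflation rate.

def real_rate(nominal_pct: float, inflation_pct: float) -> float:
    return nominal_pct - inflation_pct

cbo_case = real_rate(3.0, 2.0)          # CBO projection: 3% nominal, 2% inflation
higher_inflation = real_rate(4.0, 3.0)  # both rates one point higher

# The real rate, and hence the real burden of the debt, is unchanged.
print(cbo_case, higher_inflation)  # 1.0 1.0
```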
With nominal GDP growing more rapidly due to the higher inflation, the ratio of debt to GDP would be lower than in the case with lower inflation. This means that we only need to worry about a higher interest burden if interest rates rise without a corresponding increase in the rate of inflation.[...] (READ MORE)
Corporate concentration in the US food system makes food more expensive and less accessible for many Americans
the conversation
February 8, 2021 8.37am EST
Agribusiness executives and government policymakers often praise the U.S. food system for producing abundant and affordable food. In fact, however, food costs are rising, and shoppers in many parts of the U.S. have limited access to fresh, healthy products.
This isn’t just an academic argument. Even before the current pandemic, millions of people in the U.S. went hungry. In 2019 the U.S. Department of Agriculture estimated that over 35 million people were “food insecure,” meaning they did not have reliable access to affordable, nutritious food. Now food banks are struggling to feed people who have lost jobs and income thanks to COVID-19.
As rural sociologists, we study changes in food systems and sustainability. We’ve closely followed corporate consolidation of food production, processing and distribution in the U.S. over the past 40 years. In our view, this process is making food less available or affordable for many Americans.
Fewer, larger companies
Consolidation has placed key decisions about our nation’s food system in the hands of a few large companies, giving them outsized influence to lobby policymakers, direct food and industry research and influence media coverage. These corporations also have enormous power to make decisions about what food is produced how, where and by whom, and who gets to eat it. We’ve tracked this trend across the globe.
It began in the 1980s with mergers and acquisitions that left a few large firms dominating nearly every step of the food chain. Among the largest are retailer Walmart, food processor Nestlé and seed/chemical firm Bayer.
Some corporate leaders have abused their power – for example, by allying with their few competitors to fix prices. In 2020 Christopher Lischewski, the former president and CEO of Bumblebee Foods, was convicted of conspiracy to fix prices of canned tuna. He was sentenced to 40 months in prison and fined US$100,000.
In the same year, chicken processor Pilgrim’s Pride pleaded guilty to price-fixing charges and was fined $110.5 million. Meatpacking company JBS settled a $24.5 million pork price-fixing lawsuit, and farmers won a class action settlement against peanut-shelling companies Olam and Birdsong.
Industry consolidation is hard to track. Many subsidiary firms often are controlled by one parent corporation and engage in “contract packing,” in which a single processing plant produces identical foods that are then sold under dozens of different brands – including labels that compete directly against each other.
Recalls ordered in response to food-borne disease outbreaks have revealed the broad scope of contracting relationships. Shutdowns at meatpacking plants due to COVID-19 infections among workers have shown how much of the U.S. food supply flows through a small number of facilities.
With consolidation, large supermarket chains have closed many urban and rural stores. This process has left numerous communities with limited food selections and high prices – especially neighborhoods with many low-income, Black or Latino households.
Widespread hunger
As unemployment has risen during the pandemic, so has the number of hungry Americans. Feeding America, a nationwide network of food banks, estimates that up to 50 million people – including 17 million children – may currently be experiencing food insecurity. Nationwide, demand at food banks grew by over 48% during the first half of 2020.
Simultaneously, disruptions in food supply chains forced farmers to dump milk down the drain, leave produce rotting in fields and euthanize livestock that could not be processed at slaughterhouses. We estimate that between March and May of 2020, farmers disposed of somewhere between 300,000 and 800,000 hogs and 2 million chickens – more than 30,000 tons of meat.
What role does concentration play in this situation? Research shows that retail concentration correlates with higher prices for consumers. It also shows that when food systems have fewer production and processing sites, disruptions can have major impacts on supply.
Consolidation makes it easier for any industry to maintain high prices. With few players, companies simply match each other’s price increases rather than competing with them. Concentration in the U.S. food system has raised the costs of everything from breakfast cereal and coffee to beer.
As the pandemic roiled the nation’s food system through 2020, consumer food costs rose by 3.4%, compared to 0.4% in 2018 and 0.9% in 2019. We expect retail prices to remain high because they are “sticky,” with a tendency to increase rapidly but to decline more slowly and only partially.
We also believe there could be further supply disruptions. A few months into the pandemic, meat shelves in some U.S. stores sat empty, while some of the nation’s largest processors were exporting record amounts of meat to China. U.S. Sens. Elizabeth Warren, D-Mass., and Cory Booker, D-N.J., cited this imbalance as evidence of the need to crack down on what they called “monopolistic practices” by Tyson Foods, Cargill, JBS and Smithfield, which dominate the U.S. meatpacking industry.
Tyson Foods responded that a large portion of its exports were “cuts of meat or portions of the animal that are not desired by” Americans. Store shelves are no longer empty for most cuts of meat, but processing plants remain overbooked, with many scheduling well into 2021.
Toward a more equitable food system
In our view, a resilient food system that feeds everyone can be achieved only through a more equitable distribution of power. This in turn will require action in areas ranging from contract law and antitrust policy to workers’ rights and economic development. Farmers, workers, elected officials and communities will have to work together to fashion alternatives and change policies.
The goal should be to produce more locally sourced food with shorter and less-centralized supply chains. Detroit offers an example. Over the past 50 years, food producers there have established more than 1,900 urban farms and gardens. A planned community-owned food co-op will serve the city’s North End, whose residents are predominantly low- and moderate-income and African American.
The federal government can help by adapting farm support programs to target farms and businesses that serve local and regional markets. State and federal incentives can build community- or cooperative-owned farms and processing and distribution businesses. Ventures like these could provide economic development opportunities while making the food system more resilient.
In our view, the best solutions will come from listening to and working with the people most affected: sustainable farmers, farm and food service workers, entrepreneurs and cooperators – and ultimately, the people whom they feed.
Even Biden’s $1.9 Trillion Isn’t Nearly Enough Pandemic Relief
And the Miserly Proposal of Senate Republicans Shows Again That They Care Only About White Americans
By David Cay Johnston, DCReport Editor-in-Chief
2/4/2021
President Joe Biden wants a $1.9 trillion pandemic relief package; Senate Republicans only a third that much. Both proposals are too little for too short a time. Even more important, both propose too little for where help is needed most.
The Republicans say America just can’t afford more relief. Their skimpy plan would provide nothing for renters facing eviction, the latest sign that since Trump the Republicans have chosen to become the party of white skin privilege: it is Black and Latino renters who are most at risk of eviction.
The GOP would do little for small business, offering only 10 cents on the Biden dollar; to reopen schools, 12 cents; for the jobless, 34 cents.
Complaints that America can’t afford the Biden relief package ring hollow. The Trump-Radical Republican tax cuts—passed in December 2017 without a single vote from a Democrat—used borrowed money to bestow $2.4 trillion in tax savings to large corporations and rich individuals. The 99% got crumbs.
Biden’s tax plan would raise $2.1 trillion over 10 years, taking back much of the savings from large companies and individuals making more than $400,000 per year.
Welfare for the Rich
The COVID-19 relief package Trump signed into law last spring was heavily weighted to those who didn’t need help. As the graphic below shows, such excess cash nearly doubled between when Trump assumed office and last May. Only a little of that money has been withdrawn, an indication that federal aid included a lot of welfare for the rich.
During the pandemic, many millions of American households have reduced their debts and increased their savings because they had continued to be employed while their spending dropped. They don’t need relief.
Since your spending is my income, and vice versa, the people suffering because of the drop in spending do need help. Think of restaurant workers, barbers, gym trainers, and retail store clerks.
Invisible Suffering
In a nation whose political and economic power structure is largely white and in which large numbers of Americans have few to no minorities as neighbors, it’s easy for those suffering the most to be invisible.
One of the striking patterns I’ve noticed in print and broadcast interviews with COVID-19 deniers, the people who reject masks and social distancing, is that they are overwhelmingly white and often say they don’t know anyone who died from the virus. These people let their anecdotal experience trump what the data show.
It’s easy to never notice the depth and breadth of American poverty because it’s highly concentrated in neighborhoods with aging houses and apartments, little bus service, and often far from good jobs. There’s plenty of rural poverty, too, which is even less likely to be noticed by the majority culture.
The data show the tremendous suffering in places like Rochester, N.Y., a once fabulously rich city. These days every sixth Rochester resident subsists on less than half the income needed to escape poverty.
For a family of four, that’s $1,000 per month or less, hardly enough for a single person to pay bills. Rochester is far from alone among once rich cities turned poor.
$25,000 a Year
The official federal poverty measure, created when Lyndon Johnson was president, is badly outdated. It assumes that an income of about $500 per week or about $25,000 per year is enough for a family of four to escape poverty. That’s not even true in rural areas where housing is dirt cheap.
A newer and much more reliable poverty measure called ALICE should guide Congress.
ALICE is an acronym for Asset Limited, Income Constrained, Employed – people who work but don’t have savings and find it hard to make ends meet. The concept was developed by the nonprofit United Way. Think of the majority of American households who have less than $1,000 in savings and often nothing at all.
ALICE indicates that to meet basic needs a family of four in most urban areas needs about $1,200 per week—more than $62,000 per year. That’s not much less than the median household income these days of about $68,000.
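A quick conversion makes the gap between the two yardsticks concrete. This sketch uses the article's round weekly figures and approximate median income; the annual numbers are simply week times 52:

```python
# The two poverty yardsticks discussed above, converted to annual terms.
# Weekly figures are the article's round numbers; conversion is week * 52.

official_weekly = 500  # rough federal poverty threshold, family of four
alice_weekly = 1_200   # ALICE basic-needs budget, urban family of four

official_annual = official_weekly * 52  # 26,000 (the article rounds to about $25,000)
alice_annual = alice_weekly * 52        # 62,400 ("more than $62,000")

median_household = 68_000  # approximate median household income
print(f"ALICE as share of median income: {alice_annual / median_household:.0%}")  # 92%
```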
$10 Million a Year
For perspective, a third of American households make less than $25,000; 90% make less than $134,000. The fastest growth is among the $10 million-and-up class that now has more than 24,000 households.
The economic carnage from Trump’s incompetent and malicious response to the coronavirus has been concentrated in poor and minority neighborhoods. In large part, that’s because close to two-thirds of higher-income Americans have jobs they can do from home, while only about 9% of lower-paid workers enjoy that privilege.
And many of these lower-paid workers are deemed essential, risking their health and sometimes dying so we have clean floors in hospitals, food on grocery store shelves, and other fundamental services.
Many of those hardest hit are workers who show up and do their jobs but have had their hours cut or their jobs eliminated. That’s where relief needs to be concentrated – extending jobless benefits and making sure people aren’t evicted, which can make them homeless and set them back financially for years.
Families Facing Eviction
As many as 40 million families are at risk of eviction because they can’t pay the rent. Miles-long lines of cars at food giveaways shown in television news reports attest to the depth of the income losses among those made jobless by the pandemic.
Relief needs to help both renters and their landlords.
Black and Hispanic renters are “twice as likely as white renter households to be behind on housing payments and twice as likely to report being at risk of eviction,” said Sophia Wedeen of Harvard University’s Joint Center for Housing Studies.
Giving poor renters cash grants to cover their rent would be the cleanest way to help landlords avoid foreclosure. That would also save on the social costs of dealing with people who become homeless, lose their household furniture, and may not recover for years.
Loan Thieves
That first round of help also attracted hordes of thieves. The suspected thefts exceed $4 billion. A rapidly growing list of borrowers is now charged with making up vast numbers of employees so they could get Paycheck Protection Program loans that don’t have to be paid back. One North Carolina man was indicted for fraudulently collecting $5.5 million using 12 companies with wildly inflated payroll numbers.
One of the greatest needs is for more and longer-lasting jobless benefits, including an expansion of eligibility. I’ve talked to small business owners whose operations are seasonal, failed before the pandemic, or otherwise fell through the cracks in qualifying for the $400 per week in jobless benefits or the $600 per week in the initial relief bill.
Congress should make the $400 per week of federal jobless benefits open-ended, just like the pandemic’s effects are open-ended. Otherwise, as the pandemic drags on because the Trump administration failed to plan for distributing vaccines, there will have to be another vote in Congress on extending relief. Take care of it all. Now.
The Republicans say America just can’t afford more relief. Their skimpy plan would provide nothing for renters facing eviction, the latest sign that under Trump the Republicans have chosen to become the party of white skin privilege: it is Black and Latino renters who are most at risk of eviction.
The GOP would do little for small business, offering only 10 cents on the Biden dollar; to reopen schools, 12 cents; for the jobless, 34 cents.
Complaints that America can’t afford the Biden relief package ring hollow. The Trump-Radical Republican tax cuts—passed in December 2017 without a single vote from a Democrat—used borrowed money to bestow $2.4 trillion in tax savings to large corporations and rich individuals. The 99% got crumbs.
Biden’s tax plan would raise $2.1 trillion over 10 years, taking back much of the savings from large companies and individuals making more than $400,000 per year.
Welfare for the Rich
The COVID-19 relief package Trump signed into law last spring was heavily weighted to those who didn’t need help. As the graphic below shows, such excess cash nearly doubled between when Trump assumed office and last May. Only a little of that money has been withdrawn, an indication that federal aid included a lot of welfare for the rich.
During the pandemic, many millions of American households have reduced their debts and increased their savings because they had continued to be employed while their spending dropped. They don’t need relief.
Since your spending is my income, and vice versa, the people suffering because of the drop in spending do need help. Think of restaurant workers, barbers, gym trainers, and retail store clerks.
Invisible Suffering
In a nation whose political and economic power structure is largely white and in which large numbers of Americans have few to no minorities as neighbors, it’s easy for those suffering the most to be invisible.
One of the striking patterns I’ve noticed in print and broadcast interviews with COVID-19 deniers, the people who reject masks and social distancing, is that they are overwhelmingly white and often say they don’t know anyone who died from the virus. These people let their anecdotal experience trump what the data show.
It’s easy to never notice the depth and breadth of American poverty because it’s highly concentrated in neighborhoods with aging houses and apartments and little bus service, often far from good jobs. There’s plenty of rural poverty, too, which is even less likely to be noticed by the majority culture.
The data show the tremendous suffering in places like Rochester, N.Y., a once fabulously rich city. These days every sixth Rochester resident subsists on less than half the income needed to escape poverty.
a party for the stupid and racists!!!
Opinion
The Economy Does Much Better Under Democrats. Why?
G.D.P., jobs and other indicators have all risen more slowly under Republicans for nearly the past century.
By David Leonhardt - ny times
Feb. 2, 2021
A president has only limited control over the economy. And yet there has been a stark pattern in the United States for nearly a century. The economy has grown significantly faster under Democratic presidents than Republican ones.
It’s true about almost any major indicator: gross domestic product, employment, incomes, productivity, even stock prices. It’s true if you examine only the precise period when a president is in office, or instead assume that a president’s policies affect the economy only after a lag and don’t start his economic clock until months after he takes office. The gap “holds almost regardless of how you define success,” two economics professors at Princeton, Alan Blinder and Mark Watson, write. They describe it as “startlingly large.”
Since 1933, the economy has grown at an annual average rate of 4.6 percent under Democratic presidents and 2.4 percent under Republicans, according to a Times analysis. In more concrete terms: The average income of Americans would be more than double its current level if the economy had somehow grown at the Democratic rate for all of the past nine decades. If anything, that period (which is based on data availability) is too kind to Republicans, because it excludes the portion of the Great Depression that happened on Herbert Hoover’s watch.
The six presidents who have presided over the fastest job growth have all been Democrats, as you can see above. The four presidents who have presided over the slowest growth have all been Republicans.
---
What, then, are the most plausible theories?
First, it’s worth rejecting a few unlikely possibilities. Congressional control is not the answer. The pattern holds regardless of which party is running Congress. Deficit spending also doesn’t explain the gap: It is not the case that Democrats juice the economy by spending money and then leave Republicans to clean up the mess. Over the last four decades, in fact, Republican presidents have run up larger deficits than Democrats.
That leaves one broad possibility with a good amount of supporting evidence: Democrats have been more willing to heed economic and historical lessons about what policies actually strengthen the economy, while Republicans have often clung to theories that they want to believe — like the supposedly magical power of tax cuts and deregulation. Democrats, in short, have been more pragmatic.
---
For the most part, however, Republican economic policy since 1980 has revolved around a single policy: large tax cuts, tilted heavily toward the affluent. There are situations in which tax cuts can lift economic growth, but they typically involve countries with very high tax rates. The United States has had fairly low tax rates for decades.
The evidence now overwhelmingly suggests that recent tax cuts have had only a modest effect on the economy. G.D.P. grew at virtually the same rate after the 2017 Trump tax cut as before it. If anything, the Clinton tax increase of 1993 has a better claim on starting a boom than any tax cut since.
One possibility is that the two parties are both responding to the interest groups that support and finance them, suggested Ms. Wanamaker, who worked in the White House Council of Economic Advisers during the Trump administration. But the Democratic-leaning groups (like labor unions and civil-rights organizations) may favor policies that lift broad-based economic growth, while Republican-leaning groups (like the wealthy) favor policies that mostly shift income toward themselves.
These explanations are almost certainly not complete. Much of the partisan gap remains mysterious. At the end of their academic paper on it, Mr. Blinder, a former Federal Reserve vice chairman and Clinton administration official, and Mr. Watson encourage other economists to study the issue.
But if the causes are not fully clear, the pattern is. The American economy has performed much better under Democratic administrations than Republican ones, over both the last few decades and the last century. And as Ms. Wanamaker said, “Administrations do certainly have the ability to affect economic outcomes.”
Here's the real difference between Republicans and Democrats — and why it matters
John Stoehr - alternet
January 22, 2021
Let's not be naive about American politics. The truth is the Democrats are as self-interested and calculating as the Republicans are. The difference, however, must be said. While the Democrats are as partisan as the Republicans are, their partisanship runs more or less in the direction of democracy and its first and second principles, which are equality and freedom, in that order. If a political party gets what it wants in the pursuit of delivering something most people want most of the time, then so be it. That's not a cynical observation. Instead, it's an observation of a healthy republic.
The same cannot be said of the Republicans. Their interests do not run in the direction of democracy, especially not equality. Freedom is important, but by that, they mean the freedom of the dynastic and monied few to flex power over the many. Democracy is not an objective, because it and its first principle give power, rights and privilege to people who contemporary Republicans do not believe deserve them. Consider the meaning of "special interests." For the Republicans, it's the rich, businesses and corporations, almost always. For the Democrats, it's Black people, LGBTQ, teachers, labor, etc. Both parties are partisan. But only one of them is partisan in the service of most people.
If a party gets what it wants in the pursuit of delivering something most people want most of the time, so be it.
There's nothing morally wrong with being the party of corporate interests. There's nothing wrong, for that matter, with viewing politics as the preserve of the few, not the many. (Democracy does contain multitudes, after all.) What's wrong is lying about it. What's wrong is treating the opposition as if it does not have a legitimate claim. What's wrong is setting off a conflagration of white-power fury that consumes nearly everything, even the republic itself, in order to slake a thirst for power. The day Joe Biden decided to run for president was the day this white-power fury burned through Charlottesville, screaming, "Jews will not replace us." That day, according to published reports, is the day Biden chose to fight to "restore the soul of America."
Maybe he's full of it. Maybe Biden and the Democrats don't really believe what they say when they talk about everyone being in this together. That's certainly what the Republicans and their media allies believe. A critic said Thursday that we can expect to see from Biden "lofty rhetoric about unity, while acting below the radar to smash norms to implement the Left-wing agenda." The same day, a Times reporter asked the White House press secretary why the administration has not offered a bipartisan "fig leaf" to the Republicans, given the president putting so much emphasis on unity. Maybe the Democrats don't mean what they say. Maybe it's just politics-as-usual.
Please. We're all adults here, right? The Democrats are political animals, same as the Republicans. But, again, the difference must be said loud enough for a Times reporter stuck in a bubble of decadence and amorality to hear. While the Democrats are as partisan as the Republicans, their partisanship runs more or less in the direction of democracy and its first principles. Even if they don't mean what they say, the outcomes serve equality and freedom in that order. Even if Press Secretary Jennifer Psaki didn't believe what she said, she was right to suggest that what the Democrats want is what most people want, even Republicans: "Is unemployment insurance an issue that only Democrats in the country want? Do only Democrats want their kids to go back to schools? Do only Democrats want vaccines to be distributed across the country?"
The Republicans accuse the Democrats of not meaning what they say, because the Republicans nearly always don't mean what they say. If the Republicans don't, it stands to reason the Democrats don't either. But again, a difference. The Democrats aim to serve the many. The Republicans aim to serve the few (without appearing to). If the Democrats don't mean what they say, the many still benefit. Not so for the GOP.
Moreover, the Republicans, because they don't mean what they say, can't be trusted to stand by their demands. Even if, say, the Republicans demanded greater border security in exchange for supporting Biden's immigration reform bill, they'd make up a reason why the Democrats reneged on their commitment in order to renege on theirs. Psaki knows that's precisely what they did to Barack Obama. Better to trust your instincts as to what's best for the country than to trust the Republicans to tell you.
Study of 50 years of tax cuts for rich confirms ‘trickle down’ theory is an absolute sham
December 16, 2020
By Common Dreams - raw story
Neoliberal gospel says that cutting taxes on the wealthy will eventually benefit everyone by boosting economic growth and reducing unemployment, but a new analysis of fiscal policies in 18 countries over the last 50 years reveals that progressive critics of “trickle down” theory have been right all along: supply-side economics fuels inequality, and the real beneficiaries of the right-wing approach to taxation are the super-rich.
“Cutting taxes on the rich increases top income shares, but has little effect on economic performance.”
—David Hope and Julian Limberg
The Economic Consequences of Major Tax Cuts for the Rich (pdf), a working paper published this month by the International Inequalities Institute at the London School of Economics and written by LSE’s David Hope and Julian Limberg of King’s College London, examines data from nearly 20 OECD countries, including the U.K. and the U.S., and finds that the past five decades have been characterized by “falling taxes on the rich in the advanced economies,” with “major tax cuts… particularly clustered in the late 1980s.”
But, according to Hope and Limberg, the vast majority of the populations in those countries have little to show for it, as the benefits of slashing taxes on the wealthy are concentrated among a handful of super-rich individuals—not widely shared across society in the form of improved job creation or prosperity, as “trickle down” theorists alleged would happen.
“Our research shows that the economic case for keeping taxes on the rich low is weak,” Hope said Wednesday. “Major tax cuts for the rich since the 1980s have increased income inequality, with all the problems that brings, without any offsetting gains in economic performance.”
In their study, the pair of political economists note that “economic performance, as measured by real GDP per capita and the unemployment rate, is not significantly affected by major tax cuts for the rich.” However, they add, “major tax cuts for the rich increase the top 1% share of pre-tax national income in the years following the reform” by a magnitude of nearly 1%.
The researchers continue:
Our findings on the effects of growth and unemployment provide evidence against supply-side theories that suggest lower taxes on the rich will induce labour supply responses from high-income individuals (more hours of work, more effort etc.) that boost economic activity. They are, in fact, more in line with recent empirical research showing that income tax holidays and windfall gains do not lead individuals to significantly alter the amount they work.
Our results have important implications for current debates around the economic consequences of taxing the rich, as they provide causal evidence that supports the growing pool of evidence from correlational studies that cutting taxes on the rich increases top income shares, but has little effect on economic performance.
Limberg is hopeful that the research could bolster the case for increasing taxes on the wealthy to fund a just recovery from the coronavirus pandemic and ensuing economic fallout.
“Our results,” he said Wednesday, “might be welcome news for governments as they seek to repair the public finances after the Covid-19 crisis, as they imply that they should not be unduly concerned about the economic consequences of higher taxes on the rich.”
Progressives have argued that America’s disastrous handling of the ongoing catastrophe is attributable to several decades of “free-market” ideology and associated policies that exacerbated vulnerabilities and undermined the government’s capacity to respond effectively.
According to social justice advocates, taxing billionaires’ surging wealth—akin to the “Millionaire’s Tax” passed earlier this month in Argentina—could contribute to reversing the trend of intensifying inequality plaguing the nation.
US income inequality
To reverse inequality, we need to expose the myth of the ‘free market’
Robert Reich - THE GUARDIAN
12/9/2020
We need an informed public that sees through the poisonous myth billionaires want us to believe: that income is a measure of your market worth
How have a relative handful of billionaires – whose vast fortunes have soared even during the pandemic – convinced the vast majority of the public that their wealth shouldn’t be taxed in order to support the common good?
They have employed one of the oldest methods used by the wealthy to maintain wealth and power – a belief system that portrays wealth and power in the hands of a few as natural and inevitable.
Centuries ago it was the so-called “divine right of kings”. King James I of England and France’s Louis XIV, among other monarchs, asserted that kings received their authority from God and were therefore not accountable to their earthly subjects. The doctrine ended with England’s Glorious Revolution of the 17th century and the American and French revolutions of the 18th.
Its modern equivalent might be termed “market fundamentalism”, a creed that has been promoted by today’s super rich with no less zeal than the old aristocracy advanced divine right. It holds that what you’re paid is simply a measure of what you’re worth in the market.
If you amass a billion dollars you must deserve it because the market has awarded you that much. If you barely scrape by you have only yourself to blame. If millions of people are unemployed or their paychecks are shrinking or they have to work two or three jobs and have no idea what they’ll be earning next month or even next week, that’s unfortunate but it’s the outcome of market forces.
Few ideas have more profoundly poisoned the minds of more people than the notion of a “free market” existing somewhere in the universe, into which government “intrudes”. According to this view, whatever we might do to reduce inequality or economic insecurity – to make the economy work for most of us – runs the risk of distorting the market and causing it to be less efficient, or of unintended consequences that may end up harming us. The “free market” is to be preferred over “government”.
This prevailing view is utterly false. There can be no “free market” without government. A market – any market – requires government to make and enforce the rules of the game. In most modern democracies, such rules emanate from legislatures, administrative agencies and courts. Government doesn’t “intrude” on the “free market”. It creates and maintains the market.
Market rules are neither neutral nor universal. They partly mirror a society’s evolving norms and values. But they also reflect who in society has the most power to make or influence the underlying market rules.
The interminable debate over whether the “free market” is better than “government” makes it impossible for us to examine who exercises this power, how they benefit from doing so and whether such rules need to be altered so that more people benefit from them. The myth of market fundamentalism is therefore highly useful to those who do not wish such an examination to be undertaken.
It’s no accident that those with disproportionate influence over the rules of the market – who are the largest beneficiaries of how the rules have been designed and adapted – are also among the most vehement supporters of the “free market”, and the most ardent advocates of the relative superiority of the market over government.
The debate over market v government serves to distract the public from the underlying realities of how the rules are generated and changed, from the power of the moneyed interests over this process, and the extent to which they gain from the results. In other words, not only do these “free market” advocates want the public to agree with them about the superiority of the market, but also about the central importance of the interminable and distracting debate over whether the market or the government should prevail.
This is why it’s so important to expose the underlying structure of the so-called “free market” and show how and where power is exercised over it.
It’s why I write a weekly column for the Guardian – one of the few publications in the world committed to revealing the truth about the economy and exposing the myths that distract the public’s attention from what is really going on. The Guardian can do this because it’s not financed by commercial sponsors or any party with a financial or other interest in what it reports but exists solely to serve the public.
Inequalities of income, wealth and political power continue to widen across all advanced economies. This doesn’t have to be the case. But to reverse them, we need an informed public capable of seeing through the mythologies that protect and preserve today’s super-rich no less than did the Divine Right of Kings centuries ago.
CORONAVIRUS
WHEN FOOLS & THIEVES ARE IN CHARGE, GUESS WHAT!!!
Rapid Testing Is Less Accurate Than the Government Wants to Admit
Rapid antigen testing is a mess. The federal government pushed it out without a plan, and then spent weeks denying problems with false positives.
by Lisa Song - PRO PUBLICA
Nov. 16, 5 a.m. EST
The promise of antigen tests emerged like a miracle this summer. With repeated use, the theory went, these rapid and cheap coronavirus tests would identify highly infectious people while giving healthy Americans a green light to return to offices, schools and restaurants. The idea of on-the-spot tests with near-instant results was an appealing alternative to the slow, lab-based testing that couldn’t meet public demand.
By September, the U.S. Department of Health and Human Services had purchased more than 150 million tests for nursing homes and schools, spending more than $760 million. But it soon became clear that antigen testing — named for the viral proteins, or antigens, that the test detects — posed a new set of problems. Unlike lab-based, molecular PCR tests, which detect snippets of the virus’s genetic material, antigen tests are less sensitive because they can only detect samples with a higher viral load. The tests were prone to more false negatives and false positives. As problems emerged, officials were slow to acknowledge the evidence.
With the benefit of hindsight, experts said the Trump administration should have released antigen tests primarily to communities with outbreaks instead of expecting them to work just as well in large groups of asymptomatic people. Understanding they can produce false results, the government could have ensured that clinics had enough for repeat testing to reduce false negatives and access to more precise PCR tests to weed out false positives. Government agencies, which were aware of the tests’ limitations, could have built up trust by being more transparent about them and how to interpret results, scientists said.
When health care workers in Nevada and Vermont reported false positives, HHS defended the tests and threatened Nevada with unspecified sanctions until state officials agreed to continue using them in nursing homes. It took several more weeks for the U.S. Food and Drug Administration to issue an alert on Nov. 3 that confirmed what Nevada had experienced: Antigen tests were prone to giving false positives, the FDA warned.
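The false positives Nevada and Vermont reported are what Bayes’ theorem predicts whenever even a fairly specific test is used to screen a population where few people are infected. A minimal sketch with hypothetical numbers (the sensitivity, specificity, and prevalence below are illustrative, not taken from any test’s validation data):

```python
# Why a highly specific test still produces many false positives at low
# prevalence. All figures below are hypothetical, chosen only to
# illustrate the mechanism behind the nursing-home screening problem.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive results that are true positives."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Screening asymptomatic staff where only 0.5% are actually infected,
# with a test assumed to be 85% sensitive and 98.5% specific:
ppv = positive_predictive_value(0.85, 0.985, 0.005)
print(f"PPV at 0.5% prevalence: {ppv:.0%}")  # roughly 22%

# The same test in an active outbreak (20% prevalence) behaves far better:
print(f"PPV at 20% prevalence: {positive_predictive_value(0.85, 0.985, 0.20):.0%}")
```

At low prevalence, most positives are false, which is why experts quoted here wanted confirmatory PCR on hand; in an outbreak setting, the same test’s positives are largely trustworthy.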
“Part of the problem is this administration has continuously played catch-up,” said Dr. Abraar Karan, a physician at Harvard Medical School. It was criticized for not ensuring enough PCR tests at the beginning, and when antigen tests became available, it shoved them at the states without a coordinated plan, he said.
If you tested the same group of people once a week without fail, with adequate double-checking, then a positive test could be the canary in the coal mine, said Dr. Mark Levine, commissioner of Vermont’s Health Department. “Unfortunately the government didn’t really advertise it that way or prescribe it” with much clarity, so some people lost faith.
HHS and the FDA did not respond to requests for comment.
The scientific community remains divided on the potential of antigen tests.
Epidemic control is the main argument for antigen testing. A string of studies shows that antigen tests reliably detect high viral loads. Because people are most infectious when they have high viral loads, the tests will flag those most likely to infect others. Modeling also shows how frequent, repeated antigen testing may be better at preventing outbreaks than highly sensitive PCR tests, if those tests are used infrequently and require long wait times for results. So far, there are no large-scale, peer-reviewed studies showing how the antigen approach has curbed outbreaks on the ground.
People need to realize that without rapid testing, we’re living in a world where many people are unknowingly becoming superspreaders, Karan said. About 40% of infections are spread by asymptomatic people with high viral loads, so antigen tests, however imperfect, shouldn’t be dismissed, he said.
Even those who are more skeptical said they can be helpful with a targeted approach directed at lower-risk situations like schools, or outbreaks in rural communities where PCR is impractical, rather than nursing homes where a single mistake could set off a chain of deaths.
It is “completely irresponsible” to take a less-accurate test and say it applies to all situations, said Melissa Miller, director of the clinical microbiology lab at the University of North Carolina.
There’s no precedent for the government to bet this much on a product before it’s been thoroughly vetted, said Matthew Pettengill, scientific director of clinical microbiology at Thomas Jefferson University. “They put the cart before the horse, and we still can’t see the horse.”
The Government Quickly Embraced an Unproven Test
During a public health crisis, the FDA can issue emergency use authorizations to make tests available that might otherwise have been subjected to many months of scrutiny before being approved. The three most popular antigen tests in the U.S., from Abbott Laboratories, Quidel and Becton, Dickinson, commonly known as BD, had to submit far less proof of success than is usually required.
FDA gave the first authorization to Quidel on May 8 based on data from 209 positive and negative samples. BD got its permit July 2 with a total of 226 samples and Abbott in late August with 102. Outside of a pandemic, the agency might otherwise have required hundreds more samples; in 2018, BD’s antigen test for the flu provided data on 736 samples.
There’s no excuse for the small pool of data, particularly for Abbott, Pettengill said. At the start of the pandemic, the FDA authorized PCR tests based on as few as 60 samples because it was difficult to find confirmed cases. By the time Abbott got its authorization in August, it was “a completely different ballgame.” Abbott’s validation document states the company collected swabs from patients at seven sites. Given the case counts over the summer, it should have only taken a few days to collect many hundreds of samples, Pettengill said.
Abbott didn’t respond to requests for comment. Quidel pointed ProPublica to an article in The New England Journal of Medicine that explained how regular antigen testing can contain the pandemic by identifying those who are most infectious.
“We have full confidence in the performance” of our test, Kristen Cardillo, BD’s vice president of global communication, said in an email. BD “completed one of the most geographically broad” clinical trials for any antigen test on the market, she added, by “collecting and analyzing 226 samples from 21 different clinical trial sites across 11 states.”
The day after the Abbott test was authorized, HHS placed a huge bet on it, buying 150 million tests.
Then, it gave institutions like nursing homes advice on how to use them off-label, in ways for which they were untested and unproven.
The three tests are authorized for the most straightforward cases: people with COVID-19 symptoms in the first week of symptoms. That’s how they were validated. They produced virtually no false positives that way and were 84% to 97% as sensitive as lab tests, meaning they caught that range of the samples deemed positive by PCR.
Yet HHS allowed their use for large-scale asymptomatic screening without fully exploring the consequences, Pettengill said.
A recent study, not yet peer reviewed, found the Quidel test detected over 80% of cases when used on symptomatic people and those with known exposures to the virus, but only 32% among people without symptoms, The New York Times reported. [...]
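The serial-testing strategy its proponents describe softens the gap between an 80%-plus and a 32% per-test detection rate, since repeated rounds compound. A rough sketch, under the simplifying (and debatable) assumption that a person’s results across rounds are independent:

```python
# Illustrative repeat-testing arithmetic: even a test that catches only
# 32% of asymptomatic cases per use (the figure reported for the Quidel
# test above) detects most cases over several rounds -- assuming,
# optimistically, that each round is an independent draw.

def detection_probability(per_test_sensitivity, n_tests):
    """Chance of at least one positive across n independent tests."""
    return 1 - (1 - per_test_sensitivity) ** n_tests

for n in (1, 2, 4):
    print(f"{n} test(s): {detection_probability(0.32, n):.0%}")
# 1 test: 32%, 2 tests: 54%, 4 tests: 79%
```

The independence assumption is generous: if a test misses someone because their viral load is low, it may keep missing them on later days, so real-world serial detection rates could be worse than this sketch suggests.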
READ MORE
RELATED: Chief adviser on Operation Warp Speed says ‘the president has never been actively involved’ in COVID-19 vaccine
Trump’s Wildly Exaggerated Help For Black Voters
It’s Definitely Not Urban Blacks Who Benefit Most From Federal Program That Creates Very Few Jobs
Terry H. Schwadron - dc report
10/13/2020
As part of every campaign speech – and in that single, awful debate – Donald Trump refers to helping black voters. Specifically, he cites lower unemployment numbers than in the years before his presidency and the introduction of opportunity zones for investment in specified urban neighborhoods.
Actually, when you look at these opportunity zones, the investments are spotty, the job gains are negligible and they serve primarily as write-off opportunities for wealthy investors to avoid federal capital gains taxes.
A Politico examination of the program found there is less to these vaunted programs than promoted by Trump, who is looking for ways to show he has been the most beneficial sponsor of programs for the black community since Abraham Lincoln. That’s his urban mantra.
Trump claims this anti-poverty program has attracted “$100 billion of new investment . . . into 9,000 of our most distressed neighborhoods” and created “countless jobs.”
Now even the White House Council of Economic Advisers says the number is closer to $75 billion in private investment since December 2017. Yet independent sources told Politico that even that figure is between 2½ and seven times too high; they put actual investments between $10 billion and $30 billion. Since the program was set up with no reporting rules, there is no way to know.
The 60 Minutes television program talked with owners of benefiting businesses, who said the claims of added jobs are figments of presidential imagination. They said existing businesses have been helped, but only to the tune of tens of jobs each.
In other words, the Trump claims are not wrong – just way overstated.
Attracting investors
Opportunity zones were created in December 2017 to attract investors, giving them the chance for tax avoidance if they re-invested in funds concentrating on businesses in or abutting Census tracts designated as high- or low-poverty areas.
Politico found that to date, these opportunity zones mostly benefited neighborhoods already on the upswing and middle-class renters. The opportunity zone program has no job guarantees and no mechanism requiring projects to benefit any poor person. Indeed, much of the money has gone toward building luxury apartments, hotels and office towers – showing that the main beneficiaries are, well, wealthy rather than poor.
A recent study suggests the opportunity zones have actually attracted slightly fewer new jobs than areas that were eligible for the zone program, but not selected for it. The Economic Innovation Group looked at Cleveland, which is home to half of the publicly announced opportunity zone projects in Ohio. Interviews with the developers show that opportunity zone funding was not essential to making the projects happen, since they were already active.
One example cited was Kevin Wojton, who bought an abandoned building to convert it into a rock-climbing gym, yoga studio and tech nonprofit with a bank loan and his own $100,000 investment, for which he gets a tax write-off. Total job impact: 50 to 75 temporary construction jobs and 15 to 25 permanent jobs in an area now starting to boom. Two nearby public housing projects qualify it for the program.
In other words, who’s benefiting most? Real estate developers like Donald Trump win.
Election fodder
Joe Biden has criticized opportunity zones for subsidizing too many “high-return projects, like luxury apartments.” He’s proposed including incentives for investors and developers to work with community organizations to build projects with social benefits.
More affordable housing would be one such focus, and there are examples of investors using the program to build apartments with variable rents, but mostly housing for middle-class tenants rather than the poor.
Would these have come about without the opportunity zones? Experts say philanthropic funds and participation by important neighbors like hospitals or a good-sized employer may have done so as well. The issue seems to be to get the attention of investors – which is where the tax incentives come in.
Ned Hill, an economic development professor at Ohio State University, told Politico that neither the climbing gym nor the apartments for medical residents will “affect the lives of low-income people in any major way.” Yet city halls and chambers of commerce praise the zones, Hill says, because they can add to old industrial cities’ tax bases by helping some projects “become bankable.”
Tax credits don’t assure that poor areas will become economically viable, but they may extend the boundaries of adjoining built-up areas.
Housing and real development in economically deprived areas will require more direct government intervention – you know, just the kind of thing that Trump calls socialism.
Until then, take the claims of success with opportunity zones with a huge grain of salt.
Actually, when you look at these opportunity zones, the investments are spotty, the job gains are negligible and they serve primarily as write-off opportunities for wealthy investors to avoid federal capital gains taxes.
A Politico examination of the program found there is less in these vaunted programs than promoted by Trump looking for ways to show he has been the most beneficial sponsor of programs for the black community since Abraham Lincoln. That’s his urban mantra.
Trump claims this anti-poverty program has attracted “$100 billion of new investment . . . into 9,000 of our most distressed neighborhoods” and created “countless jobs.”
Now even the White House Council of Economic Advisers says the number is closer to $75 billion in private investment since December 2017. Yet independent sources told Politico it is between 2½ and seven times too high. They said there have been investments between $10 million and $30 million. Since the program was set up with no reporting rules, there is no way to know.
The 60 Minutes television program talked with owners of benefitting businesses, who said the claims of added jobs are figments of presidential imagination. They said existing businesses have been helped, but to the tune of adding say 10s of jobs each.
In other words, the Trump claims are not wrong – just way overstated.
Attracting investors
Opportunity zones were created in December 2017 to attract investors, giving them the chance for tax avoidance if they re-invested in funds concentrating on businesses in or abutting Census tracts designated as high- or low-poverty areas.
Politico found that to date, these opportunity zones mostly benefited neighborhoods already on the upswing and middle-class renters. The opportunity zone program has no job guarantees and no mechanism requiring projects to benefit any poor person. Indeed, much of the money has gone toward building luxury apartments, hotels and office towers – showing that the main beneficiaries are, well, wealthy rather than poor.
A recent study suggests the opportunity zones have actually attracted slightly fewer new jobs than areas that were eligible for the zone program but not selected for it. The Economic Innovation Group looked at Cleveland, which is home to half of the publicly announced opportunity zone projects in Ohio. Interviews with the developers show that opportunity zone funding was not essential to making the projects happen, since they were already under way.
One example cited was Kevin Wojton, who bought an abandoned building to convert it into a rock-climbing gym, yoga studio and tech nonprofit, financed by a bank loan and $100,000 of his own money, for which he now gets a tax write-off. Total job impact: 50 to 75 temporary construction jobs and 15 to 25 permanent jobs in an area now starting to boom. Two public housing projects nearby qualify it for the program.
In other words, who benefits most? Real estate developers like Donald Trump.
Election fodder
Joe Biden has criticized opportunity zones for subsidizing too many “high-return projects, like luxury apartments.” He’s proposed including incentives for investors and developers to work with community organizations to build projects with social benefits.
More affordable housing would be one such focus, and there are examples of investors using the program to build apartments with variable rents, but mostly housing for middle-class tenants rather than the poor.
Would these have come about without the opportunity zones? Experts say philanthropic funds and participation by important neighbors like hospitals or a good-sized employer might have done the job as well. The challenge is getting the attention of investors – which is where the tax incentives come in.
Ned Hill, an economic development professor at Ohio State University, told Politico that neither the climbing gym nor apartments for medical residents will “affect the lives of low-income people in any major way.” Yet city halls and chambers of commerce praise the zones, Hill says, because they can add to old industrial cities’ tax bases by helping some projects “become bankable.”
Tax credits don’t assure that poor areas will become economically viable, but they may extend the boundaries of adjoining built-up areas.
Housing and real development in economically deprived areas will require more direct government intervention – you know, just the kind of thing that Trump calls socialism.
Until then, take the claims of success with opportunity zones with a huge grain of salt.
america's plantations!!!
Powerless Farmworkers Suffer Under Trump’s Anti-Migrant Policies
Labor Department Won’t Stop Corporate Abuses of ‘Essential’ Laborers; High COVID Rates, Wage Cuts, Squalid Housing
By Joe Maniscalco - dc report
10/2/2020
Donald Trump is making it easier for U.S. growers to profit off a system of “government-sanctioned human trafficking,” a practice that also appears to be worsening the pandemic.
At issue is a federal visa program called H-2A that allows foreign workers to manage and pick crops, low-paid work few Americans are willing to do.
Farmworker advocates call the H-2A visa program “intrinsically abusive.” They characterize it as a scheme that ties migrant workers to the whims of a single employer.
“The program is structured in such a way that there is a huge power imbalance that favors employers versus workers,” Maria Perales Sanchez of Centro de los Derechos del Migrante Inc. (CDM, Center for Migrant Rights) told me. CDM spent five months between September 2019 and January 2020 interviewing H-2A workers across Mexico to understand their conditions. It found systemic violations of basic worker rights.
Universal Abuse
“One hundred percent of those surveyed reported at least one illegal abuse of their rights — 96% of them reported three or more. Abuses accumulate and put workers at even more risk,” Perales Sanchez says.
The farmworkers worked largely in Florida, but also Georgia, Washington and North Carolina. They reported violent attacks, verbal abuse, and sexual harassment.
“We have seen the Trump Administration consistently fail to enact enforceable protections and standards for agricultural workers, and specifically H-2A workers,” Perales Sanchez said.
Now the Trump Administration is changing the rules in ways that will make migrant farmworkers even more vulnerable to the worst employers.
“The H-2A program is easing regulations for employers, making it easier for them to have access to workers,” Perales Sanchez said. “We’re not seeing workers’ rights and protections. In the context of COVID, this puts them at greater risk. There’re no enforceable protections.” That means the risk of workers contracting COVID-19 and spreading it increases.
Lawsuit Filed
CDM, together with the Michigan Immigrant Rights Center and Farmworker Legal Services of Michigan, helped six H-2A workers from Mexico sue Four Star Greenhouse Inc. and its president, Thomas Smith, for unpaid wages, knowingly benefiting from labor trafficking, and retaliation against those who complained. Four Star is one of the largest greenhouses in Michigan; it sells plants under its “Proven Winners” brand name.
The company’s lawyer, Michael Stroster, issued a statement denying any problems.
“We take the allegations made by former contracted workers very seriously and find it particularly disturbing that anyone would allege that Four Star withheld payments, threatened contracted workers, attempted or was involved in any way with deporting any individual who worked at its facility. That is simply not true. We are extremely confident that Four Star acted appropriately and lawfully at all times and that the allegations levied against our company will be dismissed in their entirety.”
The company said it is “dedicated to complying with applicable laws, rules, and regulations and ensuring a safe workplace for all.”
Endure or Leave
Ben Botts, the CDM legal director, noted that migrant farmworkers in the H-2A program who are hired by abusive employers have the “option to endure — or leave the country.”
“The trafficking law as written doesn’t have to include physical force — actually, it’s a lot more widespread than that. Workers are vulnerable — [the bosses] use the coercive power they know they have over workers.”
Last year, the Department of Labor approved 257,000 H-2A visas, according to Farmworker Justice, a non-profit organization fighting on behalf of seasonal and migrant farmworkers.
Overall, there are 2.5 million farmworkers in the U.S. That doesn’t count poultry and meatpacking plants. The overwhelming majority — 83% — are immigrants. Half are undocumented.
Anemic labor protections mean farmworkers — and especially those in the H-2A program — are forced to live in cramped housing conditions that make them exceedingly vulnerable to COVID-19.
“We’re seeing this played out in a few different ways,” Emma Kreyche, a senior worker rights advocate at the Worker Justice Center of New York, told me.
Fake Quarantine
“There are some guidelines in place for placing H-2A workers that are arriving in New York from other countries into a kind of precautionary quarantine,” she said. “However, this isn’t a true quarantine in the sense that we’re accustomed to thinking about it because it really just calls for workers who arrive together to stay together in housing that is separate from other workers.”
The efficacy of the quasi-quarantine practice is dubious, Kreyche said.
“We have seen mixed results and mixed reports,” Kreyche says. “One concern we have — because, again, workers are being housed together, if there’s one worker contaminated in a group of 20 or 40, who have traveled together — then the likelihood of infection spreading within that group is unmitigated by this practice. Secondly, we have heard reports where these workers are, in fact, being housed with other individuals who have been on the farm for longer periods of time. So, you might have several people housed together in a trailer and there are new people coming and going without much clear organization or sense of quarantine.”
Other concerns revolve around poorly communicated safety protocols and the lack of testing.
“There isn’t available testing for the most part for workers that are asymptomatic,” Kreyche adds. “So, until we have some kind of testing regimen for seasonal workers that are coming onto a farm, it’s really hard to protect them or their coworkers. In most rural counties, if you don’t have transportation, if you don’t have health insurance, it can be very difficult to get tested if you’re asymptomatic.”
The Food & Environment Reporting Network puts the number of COVID-19 positive farmworkers at 7,718 nationwide with nearly 20 fatalities.
Administration Does Nothing to Help
So far, the Trump Administration has ignored calls for reforms to protect the health of workers.
Instead, Trump & Co. accelerated the H-2A visa program by relaxing rules meant to give preference to U.S. workers and loosening housing and other labor protections. That paved the way for deep wage cuts for already poorly paid migrant farmworkers.
“The Trump Administration has gone out of its way to make it easy for agricultural employers to bring in guest workers during the pandemic,” Bruce Goldstein, president of Farmworker Justice, told DCReport. “They’ve eased certain processes. And yet the Trump administration has refused to require safety protections during transportation to the U.S., in housing or in the workplace.”
The U.S. Department of Labor, under Trump-appointed Labor Secretary Eugene Scalia, openly rebuffed farmworker advocates and their calls for job safety requirements over the summer and, instead, touted the issuance of COVID-19-related “guidance,” saying, in part, “State and local governments are best suited to know and understand the pandemic and appropriate response at their level.”
Whitney Ford, head of the Labor Department’s Division of Immigration and Farm Labor, sent Goldstein a dismissive letter typical of the attitude of the Trump Administration toward workers, especially workers of color.
“You suggested numerous enhancements to the H-2A program to address safety and health concerns related to housing, transportation, social distancing at work, paid medical treatment, and specific disclosure requirements related to COVID-19,” Ford wrote. “Many of these suggestions are beyond the scope of the Department’s authority. The Department will continue to closely monitor the COVID-19 pandemic as it relates to farmworkers, issue guidance and promulgate regulations as appropriate if Congress enacts new legislation.”
Anti-Labor Appointments
Scalia, an anti-labor lawyer and the son of the late right-wing Supreme Court Justice Antonin Scalia, became Labor Secretary in 2019. Trump named him after Alexander Acosta resigned in disgrace over a sweetheart deal that he gave child sex trafficker Jeffrey Epstein in 2008, when Acosta was U.S. Attorney for the Southern District of Florida.
Before Acosta, Trump had tried, and failed, to install fast-food CEO and virulently anti-worker advocate Andy Puzder as head of the Labor Department in 2017. Puzder ultimately took himself out of contention after allegations of wife-beating surfaced.
This week, the U.S. Department of Agriculture canceled the Farm Labor Survey of agricultural employers — a metric the Department of Labor uses to set the main minimum wage under the H-2A agricultural guest worker program.
Without that survey, farmworker advocates say both U.S. agricultural workers and migrant farmworkers enrolled in the H-2A guest worker program are in for massive wage cuts.
Juan Antonio Zuniga is a 68-year-old farmworker who puts in 10-hour days, six days a week, and somehow manages to help the Rural and Migrant Ministry in New York State organize other vulnerable workers. The Salvadoran émigré told me that he would love to help organize H-2A workers, too, but that they are hard to reach. His son, a construction worker, has COVID. He also knows many farmworkers who have gotten sick with the coronavirus.
“The farmworkers next door have gotten sick,” Zuniga said through an interpreter. “They asked for masks but were told by the boss — ‘There is the door.’”
Farmers Are Plagued by Debt and Climate Crisis. Trump Has Made Things Worse.
BY Leanna First-Arai, Truthout
PUBLISHED September 26, 2020
At a campaign rally at an airport in Wisconsin on September 17, President Trump announced a second round of COVID-related relief payments for farmers under the Coronavirus Food Assistance Program (CFAP 2). For the first time, producers of commodities, including wine grapes, goats and hemp, have become eligible for payments, which are expected to total $13 billion.
But many farmers say stopgap payments like this are a far cry from what they need to navigate intersecting crises, including the corporate consolidation of farms and the climate crisis, which have shaken the foundations of the brittle farm economy. Only farmers who derive 75 percent of their total income from farming are eligible for CFAP 2 assistance. Yet most farmers rely on off-farm jobs to supplement on-farm income, according to the family farm advocacy network Farm Aid. In 2018, median on-farm net income was actually negative: -$1,735.
Though it has not become a major election issue, U.S. farmers say a crisis of farm profitability and an increase in food scarcity may be looming absent major changes to the industrial agriculture system.
In 2009, when milk prices fell to a six-year low, then-dairy farmer Rob Bass made the difficult decision to sell all 1,000 of his cows. Bass and his family had run out of credit, and he was losing money producing milk. Their farm, in northeast Connecticut, had been in the Bass family since 1710.
Bass decided to transition the farm to growing corn and hay, which he calculated might deliver a more stable income. He did well the first few years as corn prices climbed, reaching an all-time high of $8.02 a bushel in August of 2012. But prices have been on the decline since the fall of 2012, due to corn stockpiling. A deadly disease impacting young pigs and a drought in the Great Plains affecting beef cattle led to lower demand for corn feed. Bass has struggled to make ends meet. As his sister, Jenni Bass, tells Truthout, while the family farm used to have a lengthy payroll, they no longer employ anyone, nor are they able to pay themselves. All income goes toward the family farm mortgage and property taxes. “Our margins are so small, so narrow,” Bass said.
According to a 2012 article in Nature, climate change is a major driver of corn price variability. Summer 2020 was the hottest on record in Connecticut, where the Bass family farm is located. Worldwide, the last six years have been the hottest ever recorded.
Farmers like Bass recognize that agriculture is a major contributor to greenhouse gas emissions, on account of widespread practices that deplete the soil of nutrients, causing it to leak greenhouse gases like nitrous oxide and produce runoff that spurs algal blooms. Agriculture accounts for 10 percent of U.S. emissions, according to government reports.
But Bass says he’s stuck in the current system, planting genetically modified seeds, and spraying the matching herbicide. It’s expensive and makes for a less biodiverse farm, but it also requires less labor to keep weeds at bay.
“We’d love to go organic, but it would take three years to get certified,” he says, and it’s unclear what would pay the bills in the meantime.
Trapped in a Cycle of Personal and Ecological Risks
Farmer Mary Agnes Rawlings and her husband have been trying to break out of the industrial agriculture “trap” since returning to her husband’s family farm around two decades ago. One hundred ten acres of the 154-acre central Illinois farm are now organic row crops, with the help, in part, of USDA grants. On August 10, 2020, the Trump administration announced it would reduce the amount the government reimburses farmers under the program to 50 percent of costs, up to $500 per project, due to “limited funding.”
“It hurts,” Rawlings says, “when farmers are already struggling with $3-a-bushel corn,” noting that at those prices, farmers are losing 50 cents per bushel of corn they produce.
The alternative to organic, Rawlings points out, is using GMO seeds, but foods grown with them have been linked to declines in kidney and liver function in animal studies. Putting themselves at risk takes a psychological toll, she says. Rawlings suspects her father, who worked on the family farm until he died of colon cancer, may have gotten sick because of chemicals the USDA continues to allow, like glyphosate, the synthetic herbicide trademarked by Monsanto. Despite studies associating glyphosate with increased cancer risk, in January 2020 the Environmental Protection Agency announced there were “no risks of concern to human health” related to the chemical.
---
Lawmakers on both sides of the aisle continue to take donations from powerful corporations that form the backbone of the industrial agriculture system, like Bayer and Monsanto, thereby incentivizing their support of the broken agriculture system. Meanwhile farmers are trapped in a cycle of responding to the effects of climate change with solutions that further contribute to greenhouse gas emissions, rather than transitioning to diversified farming operations that could produce more high quality food and mitigate emissions fueling the climate crisis.
Losing the Ability to Feed Ourselves
In the United States, as in many parts of the world, food deserts are increasingly common. But at the global level, The New York Times has reported, 100 million more people will be at acute risk of hunger above 2 degrees Celsius of global warming, due to crop losses caused by extreme weather and a shortage of pollinators, for instance.
“You’re sort of reaching a breaking point with land itself and its ability to grow food and sustain us,” Aditi Sen, senior policy adviser on climate change at Oxfam America, told The New York Times.
While the United States has historically positioned itself to provide aid to other countries, policy experts with the Center for Strategic and International Studies suggest the U.S. must urgently look inward to address its own climate-related food security issues.
Twelve percent of the U.S. population experiences food insecurity, compared with 6.9 percent of the population across Asia and 9.8 percent across Latin America. A 2019 brief points out that while regenerative agricultural practices like cover cropping are on the upswing, that’s only the case on a small percentage of land. The brief notes that universities might devote more resources toward climate solutions in agriculture. But academics aiming to do just that have struggled to circulate their work under the Trump administration.
A 2019 investigation by Politico revealed that the USDA refused to publicize at least 45 government-sponsored studies detailing how U.S. agricultural operations have been and will continue to be impacted by climate change. One explains how cattle farmers in the Southern Plains might be impacted by waning water levels in the Ogallala Aquifer. Another study outlines practices that could help farmers in the Southern Mississippi Delta become more climate resilient.
As small agricultural operations throughout the U.S. attempt to adapt to more variable weather patterns, farmers like Michigan organic orchardist Tom Rosenfeld regularly consider calling it quits. Rosenfeld has lost full crops of apples to pests that are new to the area on account of the changing climate. Like many farmers, Rosenfeld supplements his farm income with other jobs. “I would have expected that by now I would have unlocked the key to a sustainable income,” he says, reflecting on when he purchased his orchard fifteen years ago.
Between 2011 and 2018, the number of farms in the U.S. declined by almost 5 percent as farmers faced similar challenges, while the average farm size increased. With conditions for farmers as they are, Rosenfeld worries there might soon be a shortage of farmers who produce “specialty crops” — the term the USDA uses for fruits, vegetables and other crops intended for human consumption rather than animal feed or biofuel. Under the current agricultural system, only 2 percent of U.S. farmland is used to grow food, while 59 percent is used to grow commodity crops like corn and soybeans.
---
As Successful Farming reports, the “Trump bump” in farm income from programs like CFAP is set to drop precipitously, by 17 percent in 2021. “I just wish politicians actually thought about farm policy in a real way,” Bass says. “In a loving way, not just a bureaucratic way.”
But many farmers say stopgap payments like this are a far cry from what they need to navigate intersecting crises, including the corporate consolidation of farms and the climate crisis, which have shaken the foundations of the brittle farm economy. Only farmers who derive 75 percent of their total income from farming are eligible for CFAP 2 assistance. Yet most farmers rely on off-farm jobs to supplement on-farm income, according to the family farm advocacy network Farm Aid. In 2018, median on-farm net income was actually negative: -$1,735.
Though it has not become a major election issue, U.S. farmers say a crisis of farm profitability and increase in food scarcity may be looming, short of major changes in the industrial agriculture system.
In 2009, when milk prices fell to a six-year low, then-dairy farmer Rob Bass made the difficult decision to sell all 1,000 of his cows. Bass and his family had run out of credit, and he was losing money producing milk. Their farm, in northeast Connecticut, had been in the Bass family since 1710.
Bass decided to transition the farm to growing corn and hay, which he calculated might deliver a more stable income. He did well the first few years as corn prices climbed, reaching an all-time high of $8.02 a bushel in August of 2012. But prices have been on the decline since the fall of 2012, due to corn stockpiling. A deadly disease impacting young pigs and drought in the Great Plains affecting beef cattle led to lower demand for corn feed. Bass has struggled to make ends meet. As his sister, Jenni Bass tells Truthout, while the family farm used to have a lengthy payroll, they no longer employ anyone, nor are they able to pay themselves. All income goes toward the family farm mortgage and paying property taxes. “Our margins are so small, so narrow,” Bass said.
According to a 2012 article in Nature, climate change is a major driver of corn price variability. Summer 2020 was the hottest on record in Connecticut, where the Bass family farm is located. Worldwide, the last six years have been the hottest ever recorded.
Farmers like Bass recognize that agriculture is a major contributor to greenhouse gas emissions, on account of widespread practices that deplete the soil of nutrients, causing it to leak greenhouse gases like nitrous oxide and produce runoff that spurs algal blooms. Agriculture accounts for 10 percent of U.S. emissions, according to government reports.
But Bass says he’s stuck in the current system, planting genetically modified seeds, and spraying the matching herbicide. It’s expensive and makes for a less biodiverse farm, but it also requires less labor to keep weeds at bay.
“We’d love to go organic, but it would take three years to get certified,” he says, and it’s unclear what would pay the bills in the meantime.
Trapped in a Cycle of Personal and Ecological Risks
Farmer Mary Agnes Rawlings and her husband have been trying to break out of the industrial agriculture “trap” since returning to her husband’s family farm around two decades ago. One hundred ten acres of the 154-acre central Illinois farm are now organic row crops, with the help, in part, of USDA grants. On August 10, 2020, the Trump administration announced it would be reducing the amount the government reimburses farmers through the organic certification cost-share program to 50 percent, or $500 per project, due to “limited funding.”
“It hurts when farmers are already struggling with $3-a-bushel corn,” Rawlings says, noting that at those prices, farmers are losing fifty cents per bushel of corn they produce.
The alternative to organic, Rawlings points out, is using GMO seeds, but foods grown with them have been linked to a decline in kidney and liver function in animal studies. Putting themselves at risk takes a psychological toll, she says. Rawlings suspects her father, who worked on the family farm until he died of colon cancer, may have gotten sick because of chemicals the USDA continues to allow, like glyphosate, the synthetic herbicide Monsanto sells under the Roundup trademark. Despite studies associating glyphosate with increased cancer risk, in January 2020, the Environmental Protection Agency announced there were “no risks of concern to human health” related to the chemical.
---
Lawmakers on both sides of the aisle continue to take donations from powerful corporations that form the backbone of the industrial agriculture system, like Bayer and Monsanto, thereby incentivizing their support of the broken agriculture system. Meanwhile farmers are trapped in a cycle of responding to the effects of climate change with solutions that further contribute to greenhouse gas emissions, rather than transitioning to diversified farming operations that could produce more high quality food and mitigate emissions fueling the climate crisis.
Losing the Ability to Feed Ourselves
In the United States, as in many parts of the world, food deserts are increasingly common. At the global level, The New York Times has reported, 100 million more people will be at acute risk of hunger if global warming exceeds 2 degrees Celsius, due to crop losses caused by extreme weather and a shortage of pollinators, for instance.
“You’re sort of reaching a breaking point with land itself and its ability to grow food and sustain us,” senior policy adviser on climate change at Oxfam America, Aditi Sen, told The New York Times.
While the United States has historically positioned itself to provide aid to other countries, policy experts with the Center for Strategic and International Studies suggest the U.S. must urgently look inward to address its own climate-related food security issues.
Twelve percent of the U.S. population experiences food insecurity, in comparison with 6.9 percent of the population across Asia and 9.8 percent across Latin America. A 2019 brief points out that while regenerative agricultural practices like cover cropping are on the upswing, that’s only the case on a small percentage of land. The brief notes that universities might devote more resources toward climate solutions in agriculture. But academics aiming to do just that have struggled to circulate their work under the Trump administration.
A 2019 investigation by Politico revealed that the USDA refused to publicize at least 45 government-sponsored studies detailing how U.S. agricultural operations have been and will continue to be impacted by climate change. One explains how cattle farmers in the Southern Plains might be impacted by waning water levels in the Ogallala Aquifer. Another study outlines practices that could help farmers in the Southern Mississippi Delta become more climate resilient.
As small agricultural operations throughout the U.S. attempt to adapt to more variable weather patterns, farmers like Michigan organic orchardist Tom Rosenfeld regularly consider calling it quits. Rosenfeld has lost full crops of apples to pests that are new to the area on account of the changing climate. Like many farmers, Rosenfeld supplements his farm income with other jobs. “I would have expected that by now I would have found the key to a sustainable income,” he says, reflecting on when he purchased his orchard fifteen years ago.
Between 2011 and 2018, the number of farms in the U.S. declined by almost 5 percent as farmers face similar challenges, while the average farm size has increased. With conditions for farmers as they are, Rosenfeld worries there might soon be a shortage of farmers who produce “specialty crops,” the term the USDA uses for fruits, vegetables and other crops intended for human consumption rather than animal feed or biofuel. Amid the current agricultural system, only 2 percent of U.S. farmland is used to grow food, while 59 percent of land is used to grow commodity crops like corn and soybeans.
---
The Super-Rich—You Know, People Like The Trumps —Are Raking In Billions
Our Analysis of IRS Data Shows His Presidency Has Been Very Good for the Richest Americans.
By David Cay Johnston, DCReport Editor-in-Chief
9/15/2020
If you are in the 99%, here is how well you are faring under Trump policies compared to the 1%: for each dollar of increased income that you earned in 2018, each One-Percenter got $88 more income.
Huge as that ratio is, it’s small change compared to the super-rich, the 0.01% of Americans with incomes of $10 million and up. That ratio is $1 for you and $2,215 for each super-rich American household. Let’s call them the Platinum-Premiere-Point-Zero-One-Percenters.
Ponder that.
For each additional dollar you earned in 2018 compared to 2016, each of the Platinum-Premiere crowd got an additional $2,215.
The bottom line: with Trump as president it’s good to be rich.
The average Platinum-Premiere American enjoyed $7.1 million more income under Trump in 2018 than in 2016, the last year that Barack Obama was president. For the Ninety-Nine-Percenters, in contrast, average income rose just $3,360, with most of that gain among those making $200,000 to $500,000.
Not Widely Reported
You haven’t heard these numbers on the nightly news or read them in your morning newspaper because no one announced them. I distilled them from an official government report known as IRS Table 1.4, a task I’ve repeated annually for a quarter-century.
At DCReport we don’t attend press conferences, we don’t rewrite press releases and we don’t depend on access to officials because other journalists do that just fine. Instead, we scour the public record for news that oozes, news that no one announced.
Last week I reported my preliminary analysis of Table 1.4, showing that 57% of American households were better off under Obama. That contradicted Trump’s naked claim, repeated uncritically and often in news reports, that he created the best economy ever until the coronavirus pandemic.
This week’s focus is on the big changes in how the American income pie is being divvied up.
More for the Top
The rich and super-rich are enjoying a bigger slice of the American income pie.
On the other hand, this is a truly awful time to be poor. Trump policies are narrowing the pockets of the poor, the third of Americans who make less than $25,000. In 2018 their average income was just $12,600, a dollar a day less than in 2016.
Trump & Co. has numerous plans afoot to reduce incomes of the poor even more and take away government benefits, as we have been documenting at DCReport.
Less for the Bottom
The poor saw their slice of the national income pie shrink by 1 percentage point, from 6.5% to 5.5%. In a mirror image of that change, the super-rich saw their share of the income pie grow from 4.5% to 5.7% of all income.
That means the richest 22,122 households now collectively enjoy more income than the poorest 50 million households.
What these huge disparities make clear is that the sum of all Trump policies not only makes the poor worse off, but their losses are transformed into the gains of the super-rich.
The economic growth that began in early 2010 when Obama was president continued under Trump, albeit at a slower pace as DCReport showed last year. Pre-pandemic Trump underperformed Reagan, Clinton, Carter and the last six years of Obama, who inherited the worst economy in almost a century.
The continuing upward trajectory for the economy meant that overall Americans made more money in 2018 than in 2016 even after adjusting for inflation of 4.1% over two years. Total income grew by almost $1 trillion to $11.6 trillion.
Half-Trillion for the One-Percenters
Almost half of the increase went to the 1%. They enjoyed $487 billion more money. The rest of America, a group 99 times larger, divvied up $511 billion.
The big winners, though, were the super-rich, the $10 million-plus crowd. That group consists of just one in every 7,000 taxpayers yet they captured every sixth dollar of increased national income, a total gain of $157 billion.
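As a quick sanity check on the arithmetic in the figures above (a sketch, not from the article; the $1 trillion total is the article's "almost $1 trillion" rounded up), the $157 billion gain works out to roughly one dollar in six of the increase, and to the $7.1 million-per-household average cited earlier:

```python
# Sanity-check the article's arithmetic on the super-rich share of income gains.
total_gain = 1_000_000_000_000     # "almost $1 trillion" in total income growth (rounded)
super_rich_gain = 157_000_000_000  # $157 billion captured by the $10 million-plus group
households = 22_112                # number of $10 million-plus households cited

# "Every sixth dollar" of the increase:
share = super_rich_gain / total_gain
print(round(1 / share, 1))  # ~6.4, i.e. roughly one dollar in six

# Average gain per super-rich household:
avg = super_rich_gain / households
print(round(avg / 1e6, 1))  # ~7.1 (million), matching the $7.1 million average cited
```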
So, if you are among the 152 million American taxpayers in the 99% ask yourself whether Trump administration policies are good for you. Do you want a government of the rich, by the rich and overwhelmingly for the rich? Or would you prefer a government that benefits all Americans?
And see what you can do to make sure more Americans know about the big shifts in the way America’s income pie is being sliced up.
Retirements, layoffs, labor force flight may leave scars on U.S. economy
By Howard Schneider and Jonnelle Marte, Reuters
SEPTEMBER 14, 2020
(Reuters) - Judith Ramirez received a letter this month that she’d been dreading: The Honolulu hotel that furloughed her from a housekeeping job in March, during the lockdown triggered by the coronavirus pandemic, made her layoff permanent.
Ramirez, 40, was originally told she might be called back after business picked up. But infections increased in Hawaii over the summer and quarantine restrictions for visitors were extended, a blow to the state’s tourism-dependent hotels.
Six months into the pandemic, evidence of longer-term damage to the U.S. labor market is emerging, according to separate analyses of detailed monthly jobs data by labor economists and Reuters.
Retirements are drifting up, women aren’t reengaging with the job market quickly, and “temporary” furloughs like Ramirez’s are becoming permanent - trends that could weigh on the U.S. economic recovery in the short term as well as the country’s prospects in the long term.
Economic growth depends on how many people work. If more retire, or are kept from the job market because of childcare or health and safety issues, growth is slower.
“In the first few months of the recession we were much more focused on how many jobs could come back, how many jobs could be preserved,” said Kathryn Anne Edwards, a labor economist at RAND Corp. “Now the question is really how much damage has this done.”
The U.S. economic drag is falling heavily on two groups, women and older workers, who fueled a rise in labor force participation prior to the pandemic. That supported stronger-than-expected economic growth in 2018 and 2019, and showed how a historically low unemployment rate drew people back into jobs.
Those workers may now be getting stranded. Women and workers aged 65 and older make up a disproportionate share of the 3.7 million people no longer working or actively seeking a job since the pandemic hit, Labor Department data show.
People 65 and older made up less than 7% of the workforce in February, but 17% of those who have left the labor market through August. Women previously accounted for 47% of the workforce, but make up 54% of the departed.
Initial evidence of longer-term trouble is starting to show in the monthly Current Population Survey (CPS) that forms the basis of regular government employment reports.
After a spike in women leaving the labor force in the early months of the pandemic, particularly to tend to family responsibilities, there’s been slower movement back into jobs compared to the months before the pandemic, according to an analysis of CPS data by Nick Bunker, economic research director for North America at the Indeed Hiring Lab.
The percentage of women and men who moved from employed to out of the labor force jumped as the pandemic layoffs hit in April. The number of women who cited child care or family responsibilities as the reason, however, increased 178%, while the number of men citing it less than doubled, Bunker’s analysis showed.
The percentage of those women moving in the other direction month to month - from caring for family into a job - meanwhile has dropped, to a low of 5% in April from 6.6% in 2019, though it rose to 5.8% in July. It is lower for men too.
The data “suggests ... that being out of the labor force for family reasons is a ‘stickier’ state” than prior to the pandemic, Bunker said.
The Center for Retirement Research at Boston College found CPS data shows a rising share of workers 65 and older are calling it quits, a development many economists expected given the risk COVID-19 poses to older people.
Nearly a fifth of that age group working as of July 2019 were retired as of July of this year, compared to 17% for the prior year, the center’s research concluded. The percentage of these workers who consider themselves “retired” instead of merely out of work also rose steadily in recent months, from 14.2% in April to 19.5% in June.
“It is something we expected might happen - that people who were close to retirement might transition earlier,” said Anqi Chen, the center’s assistant director for savings research.
The Ugly Numbers Are Finally In On The 2017 Trump Tax Rewrite
Radical Republicans Ramrodded that Law through Congress, and, You Guessed It, the Rich Made Out Like Bandits While the Rest of Us Got Bupkis
By David Cay Johnston, DCReport Editor-in-Chief
9/5/2020
The first data showing how all Americans are faring under Donald Trump reveal the poor and working classes sinking slightly, the middle class treading water, the upper-middle class growing and the richest, well, luxuriating in rising rivers of greenbacks.
More than half of Americans had to make ends meet in 2018 on less money than in 2016, my analysis of new income and tax data shows.
The nearly 87 million taxpayers making less than $50,000 had to get by in 2018 on $307 less per household than in 2016, the year before Trump took office, I find.
That 57% of American households were better off under Obama contradicts Trump’s often-repeated claim he created the best economy ever until the pandemic.
The worsened economic situation for more than half of Americans contradicts Trump’s frequent claims that he is the champion of the “forgotten man” and his vow that “every decision” on taxes “will be made to benefit American workers and American families.”
The figures in this story come from my annual analysis of IRS data known as Table 1.4. The income figures are pre-tax money that must be reported on tax returns. I adjusted the 2016 data to reflect inflation of 4.1% between 2016 and 2018 (slightly more than 2% a year).
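The "slightly more than 2% a year" parenthetical can be checked by compounding: an assumed annual rate of about 2.03% (illustrative, not a figure from the article) compounds to the 4.1% two-year adjustment used in the analysis:

```python
# Check that ~2% annual inflation compounds to the 4.1% two-year figure cited.
annual = 0.0203  # assumed annual rate, "slightly more than 2% a year" (illustrative)
two_year = (1 + annual) ** 2 - 1
print(round(two_year * 100, 1))  # 4.1
```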
This is the first data on the first full year when Trump was president. It also is the first year of the Radical Republican tax system overhaul, passed in December 2017. The Trump tax law, the most significant tax policy change since 1986, was passed without a single public hearing or a single Democratic vote.
High Income Households Multiply
Trump policies overwhelmingly favor the top 7% of Americans. And, oh, do they benefit!
Prosperous and rich people, the data reveal, include half a million who are not even filing tax returns. Yet they are not being pursued as tax cheats, a separate report shows.
The number of households enjoying incomes of $200,000 or more soared by more than 20%. The number of taxpayers making $10 million or more soared 37% to a record 22,112 households.
Who Saves on Taxes
The Trump/Republican tax savings were highly concentrated up the income ladder with hardly any tax savings going to the working poor and only a smidgen to the middle class.
Those making $50,000 to $100,000, for example, paid just three-fourths of 1 percentage point less of their incomes to our federal government. People making $2 million to $2.5 million saw their effective tax rate fall by about three times that much.
Now let’s compare two groups, those making $50,000 to $100,000 and those declaring $500,000 to $1 million. The second group averaged nine times as much income as the first group in 2018.
Under the Trump tax law, the first group’s annual income taxes declined on average by $143, while the second group’s tax reduction averaged $17,800.
Put another way, a group that made nine times as much money enjoyed about 125 times as much in income tax savings.
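The "about 125 times" comparison follows directly from the two average tax cuts cited above (a quick check, not from the article):

```python
# Compare average income tax savings for the two groups the article contrasts.
middle_cut = 143   # average tax cut, $50,000-$100,000 group
high_cut = 17_800  # average tax cut, $500,000-$1 million group

ratio = high_cut / middle_cut
print(round(ratio))  # 124 -- roughly the "about 125 times" the article cites
```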
This disparity helps explain Trump’s support among money-conscious high-income Americans. But given the tiny tax benefits for most Americans, along with cuts in government services, it is surprising Trump enjoys significant support among people making less than $200,000.
But realize none of the biggest news organizations do the kind of analysis you are reading, at least not since I left The New York Times a dozen years ago. Instead, the major news organizations quote Trump’s claims and others’ challenges without citing details.
Understating Incomes
The figures I cite here understate actual incomes at the top for two reasons. One is that loopholes and Congressional favors allow many rich and superrich Americans to report much less income than they actually enjoy. Often they get to defer for years or decades reporting income earned today.
Second, with Trump’s support Congress has cut IRS staffing so deeply that the service cannot even pursue growing armies of rich people who have stopped filing tax returns. The sharp decline in IRS auditing means tax cheating—always a low-risk crime—has become much less risky.
Trump Ignores Rich Tax Cheats
In the three years ending in 2016, the IRS identified 879,415 high-income Americans who did not even bother to file. These tax cheats owed an estimated $45.7 billion in taxes, the treasury inspector general for Tax Administration reported May 29.
Under Trump more than half a million cases of high-income Americans who didn’t file a tax return “will likely not be pursued,” the inspector general wrote.
One of the Koch brothers was under IRS criminal investigation until Trump assumed office and the service abruptly dropped the case. DCReport’s five-part series last year showed, from a thousand pages of documents, that William Ingraham Koch, who lives one door away from Mar-a-Lago, is collecting more than $100 million a year without paying income taxes.
Borrowing to Help the Rich
Trump’s tax law will require at least $1.5 trillion in added federal debt because it falls far short of paying for itself through increased economic growth even without the pandemic. Most of the tax savings were showered on rich Americans and the corporations they control. Most of the negative effects will fall on the middle class and poor Americans in the form of Trump’s efforts to reduce government services.
The 2017 income tax law caused only a slight decline in the share of adjusted gross income that Americans paid to Uncle Sam, known as the effective tax rate. Adjusted gross income is the last line on the front page of your tax return and is the measure used in my analysis.
The overall effective tax rate slipped from 14.7% under Obama to 14.2% under Trump.
Curious Anomaly
In what might seem at first blush a curious development, Americans making more than $10 million received a below-average cut in their effective tax rate. The effective tax rate for these 22,000 households declined by less than half a percentage point.
The reason for that smaller-than-average decline is that these super-rich Americans depend less on paychecks and much more on capital gains and dividends that have long been taxed at lower rates than paycheck earnings.
The new tax data also show a sharp shift away from income from work and toward income from investments, a trend which bodes poorly for working people but very nicely for those who control businesses, invest in stocks and have other sources of income from capital.
Overall the share of American income from wages and salaries fell significantly, from almost 71% in 2016 to less than 68% in 2018.
Meanwhile, if you look just at the slice of the American income pie derived from business ownership and investments, it expanded by nearly one-tenth in two years. Income from such investments is highly concentrated among the richest Americans.
Infuriating Fact
There’s one more enlightening and perhaps infuriating detail I sussed from the IRS data.
The number of households making $1 million or more but paying no income taxes soared 41% under the new Trump tax law. Under Obama, there were just 394 such households. With Trump, this grew to 556 households making on average $3.5 million without contributing one cent to our government.
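The 41% figure is consistent with the two household counts given (a quick arithmetic check, not from the article):

```python
# Verify the growth in $1 million-plus households paying no income tax.
obama_households = 394  # such households under Obama
trump_households = 556  # such households under the 2017 Trump tax law

pct_increase = (trump_households - obama_households) / obama_households * 100
print(round(pct_increase))  # 41, matching the "soared 41%" figure
```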
Again, Trump seems to have forgotten all about the Forgotten Man. But he’s busy doing all he can to help the rich, then stick you with their tax bills.
More than half of Americans had to make ends meet in 2018 on less money than in 2016, my analysis of new income and tax data shows.
The nearly 87 million taxpayers making less than $50,000 had to get by in 2018 on $307 less per household than in 2016, the year before Trump took office, I find.
That 57% of American households were better off under Obama contradicts Trump’s often-repeated claim he created the best economy ever until the pandemic.
The worsened economic situation for more than half of Americans contradicts Trump’s frequent claims that he is the champion of the “forgotten man” and his vow that “every decision” on taxes “will be made to benefit American workers and American families.”
The figures in this story come from my annual analysis of IRS data known as Table 1.4. The income figures are pre-tax money that must be reported on tax returns. I adjusted the 2016 data to reflect inflation of 4.1% between 2016 and 2018 (slightly more than 2% a year).
This is the first data on the first full year when Trump was president. It also is the first year of the Radical Republican tax system overhaul, passed in December 2017. The Trump tax law, the most significant tax policy change since 1986, was passed without a single public hearing or a single Democratic vote.
High Income Households Multiply
Trump policies overwhelmingly favor the top 7% of Americans. And, oh, do they benefit!
Prosperous and rich people, the data reveal, include half a million who are not even filing tax returns. Yet they are not being pursued as tax cheats, a separate report shows.
The number of households enjoying incomes of $200,000 or more soared by more than 20%. The number of taxpayers making $10 million or more soared 37% to a record 22,112 households.
Who Saves on Taxes
The Trump/Republican tax savings were highly concentrated up the income ladder with hardly any tax savings going to the working poor and only a smidgen to the middle class.
Those making $50,000 to $100,000, for example, paid just three-fourths of 1 percentage point less of their incomes to our federal government. People making $2 million to $2.5 million saw their effective tax rate fall by about three times that much.
Now let’s compare two groups, those making $50,000 to $100,000 and those declaring $500,000 to $1 million. The second group averaged nine times as much income as the first group in 2018.
Under the Trump tax law, the first group’s annual income taxes declined on average by $143, while the second group’s tax reduction averaged $17,800.
Put another way, a group that made nine times as much money enjoyed about 125 times as much in income tax savings.
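The disparity above can be verified directly from the two average tax-cut figures given in the text:

```python
# Average income tax cuts under the 2017 law, per the IRS data
# cited above: $143 for the $50,000-$100,000 group, $17,800 for
# the $500,000-$1 million group.
cut_middle_group = 143
cut_high_group = 17_800

# The high-income group's tax savings relative to the middle group's.
ratio = cut_high_group / cut_middle_group
print(round(ratio))  # 124 -- i.e., "about 125 times" as much
```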
This disparity helps explain Trump’s support among money-conscious high-income Americans. But given the tiny tax benefits for most Americans, along with cuts in government services, it is surprising Trump enjoys significant support among people making less than $200,000.
But realize none of the biggest news organizations do the kind of analysis you are reading, at least not since I left The New York Times a dozen years ago. Instead, the major news organizations quote Trump’s claims and others’ challenges without citing details.
Understating Incomes
The figures I cite here understate actual incomes at the top for two reasons. One is that loopholes and Congressional favors allow many rich and superrich Americans to report much less income than they actually enjoy. Often they get to defer for years or decades reporting income earned today.
Second, with Trump’s support Congress has cut IRS staffing so deeply that the service cannot even pursue growing armies of rich people who have stopped filing tax returns. The sharp decline in IRS auditing means tax cheating—always a low-risk crime—has become much less risky.
Trump Ignores Rich Tax Cheats
In the three years ending in 2016, the IRS identified 879,415 high-income Americans who did not even bother to file. These tax cheats owed an estimated $45.7 billion in taxes, the Treasury Inspector General for Tax Administration reported May 29.
Under Trump more than half a million cases of high-income Americans who didn’t file a tax return “will likely not be pursued,” the inspector general wrote.
One of the Koch brothers was under IRS criminal investigation until Trump assumed office and the service abruptly dropped the case. DCReport’s five-part series last year showed, from a thousand pages of documents, that William Ingraham Koch, who lives one door away from Mar-a-Lago, is collecting more than $100 million a year without paying income taxes.
Borrowing to Help the Rich
Trump’s tax law will require at least $1.5 trillion in added federal debt because it falls far short of paying for itself through increased economic growth even without the pandemic. Most of the tax savings were showered on rich Americans and the corporations they control. Most of the negative effects will fall on the middle class and poor Americans in the form of Trump’s efforts to reduce government services.
The 2017 income tax law caused only a slight decline in the share of adjusted gross income that Americans paid to Uncle Sam, known as the effective tax rate. Adjusted gross income is the last line on the front page of your tax return and is the measure used in my analysis.
The overall effective tax rate slipped from 14.7% under Obama to 14.2% under Trump.
Curious Anomaly
In what might seem at first blush a curious development, Americans making more than $10 million received a below-average cut in their effective tax rate. The effective tax rate for these 22,000 households declined by less than half a percentage point.
The reason for that smaller-than-average decline is that these super-rich Americans depend less on paychecks and much more on capital gains and dividends that have long been taxed at lower rates than paycheck earnings.
The new tax data also show a sharp shift away from income from work and toward income from investments, a trend which bodes poorly for working people but very nicely for those who control businesses, invest in stocks and have other sources of income from capital.
Overall the share of American income from wages and salaries fell significantly, from almost 71% in 2016 to less than 68% in 2018.
Meanwhile, if you look just at the slice of the American income pie derived from business ownership and investments, it expanded by nearly one-tenth in two years. Income from such investments is highly concentrated among the richest Americans.
Infuriating Fact
There’s one more enlightening and perhaps infuriating detail I sussed from the IRS data.
The number of households making $1 million or more but paying no income taxes soared 41% under the new Trump tax law. Under Obama, there were just 394 such households. With Trump, this grew to 556 households making on average $3.5 million without contributing one cent to our government.
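The 41% figure follows directly from the two household counts cited above; a minimal check:

```python
# Million-dollar-plus households paying zero income tax, per the
# IRS figures cited above.
households_obama = 394
households_trump = 556

# Percentage growth between the two counts.
growth = (households_trump - households_obama) / households_obama
print(f"{growth:.0%}")  # 41%
```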
Again, Trump seems to have forgotten all about the Forgotten Man. But he’s busy doing all he can to help the rich, then stick you with their tax bills.
FINANCING RACISM AND DEATH!!!
The Fed Invested Public Money in Fossil Fuel Firms Driving Environmental Racism
BY Alexis Goldstein, Truthout
PUBLISHED September 3, 2020
In April, the price of oil fell so far so fast that oil futures contracts went negative. Oil prices have recovered somewhat, but not enough to ever return the industry to its prior state. Analysts predict the U.S. has reached peak oil production and will never again “return to the record 13 million barrels of oil per day reached in November 2019.” ExxonMobil, which has been a part of the Dow Jones Industrial Average market index for 92 years, was removed last week and replaced with the software firm Salesforce. But while financial analysts sound the alarm about this dying industry, the Federal Reserve has been buying up the debt of fossil fuel companies through a pandemic emergency program. We the public, together with the Fed, now own over $315 million in bonds of fossil fuel firms, including those with a track record of environmental racism.
Among the many Fed rescue programs is the Secondary Market Corporate Credit Facility (SMCCF), which buys corporate debt of companies. The program is supporting fossil fuel corporations in notably disproportionate numbers: More than 10 percent of the Fed’s bond purchases are fossil fuel companies, even though fossil fuel firms only employ 2 percent of all workers employed by firms in the S&P 1500 stock market index.
These bonds are effectively a public investment because the Fed is leveraging $25 billion from the CARES Act as a down payment for the SMCCF’s bond purchases. Together with the Fed, the public now holds the corporate bonds of ExxonMobil, Chevron, BP, Phillips 66 and Noble Energy. And it seems likely the Fed will be holding these fossil fuel bonds until they mature — up to five years into the future for some of them — based on comments Fed Chair Jerome Powell made to Congress.
At a time when students at universities are calling for their endowments to divest from fossil fuels, and shareholders are pressuring banks to stop financing fossil fuel projects, the world’s largest central bank is sending a different signal: that fossil fuel debt is worth owning, and safe enough to be held by the Fed and the public. This is very different from the signals the market is sending about the industry. Exxon wrote in a memo leaked to the press that due to “prolonged negative market impact” in the sector, it is weighing layoffs in the U.S. and other high-cost countries. And while the energy sector made up nearly 16 percent of the S&P 500 market index in 2008, it is now the smallest sector in the index, making up just over 2 percent. As former Federal Reserve Board Governor Sarah Bloom Raskin wrote, the Fed purchasing the debt of oil and gas firms not only “sends a false price signal to investors” about the viability of these companies, it also “increases the likelihood that investors will be stuck with stranded oil and gas assets that society no longer needs.”
As of August 10, the public now owns $15 million in bonds of dirty energy firm Marathon Petroleum. (You can see all the Fed’s bond purchases in the SMCCF in the spreadsheets they post online.) Marathon is the largest U.S. oil refining company with 16 refineries nationwide, including one in Detroit, Michigan. The company’s Detroit refinery, which sits in the majority-Black Boynton and Oakwood Heights neighborhoods, has violated Michigan state emissions limits 15 times since 2013. Those who live near the refinery have faced large, fiery flares in October 2018 and vomiting or labored breathing in February 2019 due to the rupture of a propane line and malfunction of its gas flare. After a toxic chemical leak from the refinery in September 2019, Rep. Rashida Tlaib (D-Michigan) said that Marathon “cannot be trusted to protect our health,” and held a congressional field hearing to highlight the problems.
Marathon’s violations extend well beyond Detroit. The Political Economy Research Institute lists Marathon as the 33rd worst air polluter in the nation, and its refineries disproportionately impact communities of color. In 2016, Marathon released more than 35,800 gallons of diesel into a river near Crawleyville, Indiana, which resulted in a $335,000 penalty for violating the Clean Water Act. In a 2016 settlement with the Department of Justice and the Environmental Protection Agency (EPA), Marathon agreed to spend $334 million to settle a dispute over refinery pollution in five states: Michigan, Louisiana, Ohio, Kentucky and Illinois. This was a follow-up to a 2012 agreement with the EPA regarding Marathon’s violations of the Clean Air Act.
Even before the Fed invested in Marathon’s bonds, the company was already enjoying support from the nation’s policy makers thanks to a $1.1 billion tax break it received in the CARES Act. The tax change allowed companies to apply current losses to past tax returns, which is especially beneficial to oil firms like Marathon because it will help them offset the record profits they made in 2018 using the massive losses they’ve seen in 2020 as the price of oil crashed.
The public is now also invested in Energy Transfer Operating, the company behind the 1,172-mile Dakota Access Pipeline that faced Indigenous-led resistance at the Standing Rock Sioux reservation. The Fed has purchased $28.5 million in corporate bonds from Energy Transfer to date. The brutal, violent crackdowns at Standing Rock included both surveillance and a counterintelligence campaign led by Energy Transfer’s hired security team, TigerSwan. Energy Transfer is also behind the 167-mile Bayou Bridge Pipeline in Louisiana that crosses through Black and Indigenous communities, raising serious health and safety concerns. In 2018, Energy Transfer’s security team sank two boats carrying 15 Water Protectors and members of the press at a Bayou Bridge Pipeline construction site. Energy Transfer has also been embroiled in controversy as it has tried to build the Revolution Pipeline in Pennsylvania. The pipeline has been out of service since 2018, when it ruptured and exploded, and has faced 596 new environmental violations since January.
The Fed and the public also own $24 million worth of bonds in the utility giant Southern Company and its subsidiaries, including Georgia Power and Southern Power. In 2016, the NAACP gave Southern Company an “F” corporate environmental justice performance rating due to its high emission of pollutants like sulfur dioxide and nitrogen oxides that cause health issues, noting that there are many low-income people and people of color living within three miles of its coal plants. The Southern Company-owned utility Georgia Power plans to begin shutting off the electricity of people in default on their utility bills. Advocates warn those most at risk of losing power are Black, Latinx and Native American. Georgia Power also owns Plant Bowen, soon to be the biggest coal plant in the U.S. Another Southern Company subsidiary, Virginia Natural Gas, is facing resistance from communities of color for its proposal to build a new natural gas pipeline. The proposal includes a compressor station that would impact a neighborhood with high populations of low-income communities and communities of color, according to Stop the Abuse of Virginian Energy, a coalition formed to oppose the pipeline. And in 2018, Southern Company subsidiary Alabama Power was found to have polluted the groundwater with arsenic, lithium and selenium at a plant in majority-Black Greene County, Alabama.
Donald Trump has long been trying to turn back time and prop up the dying fossil fuel industry. First it was campaigning on coal in 2016. Then it was his pledge to “never let the great U.S. Oil & Gas Industry down” in 2020. Energy Secretary Dan Brouillette told Bloomberg TV that the Fed also worked with the Energy and Treasury Departments to change the rules of one of its lending programs to address the concerns of the fossil fuel lobby. And now, the Fed has made the public unwitting investors in fossil fuel debt.
One of the Fed’s key stated missions is “promoting financial stability.” Climate catastrophe presents both physical and transition risks to the financial system overall, as documented by Graham Steele and Gregg Gelzinis. Now that a decade of bad bets and heavy indebtedness have combined with low oil prices to decimate profits in the fossil fuel industry, there’s a real risk that the Fed’s investments in these firms will cost the public in the future. The Fed should ensure that it is taking these climate-related financial risks into account. In addition, the president of the Federal Reserve Bank of Atlanta, Raphael Bostic, wrote that the Fed “can play an important role in helping to reduce racial inequities and bring about a more inclusive economy.” The Fed’s purchase of fossil fuel debt of companies exacerbating environmental racism moves in the opposite direction of both these roles. The Fed should not be using public money to help prop up a dying industry that endangers our health and safety. The Fed should stop leveraging public funds to buy risky fossil fuel debt.
reality!!!
No, Trump didn’t create ‘the greatest economy in history’ — or even in the last few years
August 30, 2020
By John A. Tures - raw story
The most repeated phrase of the Republican National Convention was that President Donald Trump created the greatest economy the world has ever seen. The evidence, however, shows that it's not the greatest in the world's history, or in American history. It may not even be better than the Obama economy, even before the pandemic hit.
GDP Growth: I went to Macrotrends.net to gather all of the data on annual growth rates from 1961 to 2019, before the pandemic. In his best year, 2018, Trump’s growth rate was 3.18 percent, good for 29th best since JFK took office. In 2019, the GDP growth rate was 2.33 percent (41st best) and in 2017, it was 2.22 percent (44th best) out of all 59 years. In 1950, our first quarter GDP growth rate was 16.7 percent, according to Trading Economics.
And that’s before 2020. The Bureau of Economic Analysis revealed that our economy shrank by 5 percent during the first quarter and by a historic low of -32 percent in the second quarter, as confirmed by Trading Economics.
Unemployment: President Barack Obama inherited the Great Recession, which began at the end of the George W. Bush administration. The unemployment rate was 9.9 percent in 2009. It declined every year down to 4.7 percent in 2016, according to Forbes. It has further declined under Trump to 3.5 percent in 2019 but then shot up to 14.7 percent in April in the midst of the pandemic. As of this writing, it is still in the double-digits.
Annual Median Household Income: As the Joint Economic Committee concluded: “During the last two years of the Obama administration, annual median household income increased $4,800. This is three times more than the $1,400 increase during the first two years of the Trump administration.” And this was published before the declines of 2020.
Stock Market: I expected that the stock market would have done better for Trump than Obama. I was incorrect. According to the JEC: “Between President Trump’s Inauguration Day and November 2019, the Dow Jones Average increased more than 40 percent, while over the eight years of the Obama administration, the DJIA increased almost 150 percent – a substantially greater pace.” The stock market had declined so sharply before Obama took office that his numbers look better primarily for this reason.
Trade Deficit: Given his rhetoric on trade and all of his claims of success, I was surprised to discover that under Trump the trade deficit grew larger than it was under the Obama administration. “Despite his aggressive trade moves, the total monthly trade deficit in goods and services — that is, the value of goods and services exported by the US minus the value of imports — has gotten larger in the last couple years,” confirms Business Insider.
There is no evidence at the writing of this column that this was the greatest economy in the history of the world, or of America, or even of the last several years. Times had been generally okay since the Great Recession of 2007-09 for both presidents, until the calamity of 2020.
'White supremacy' was behind child separations — and Trump officials went along, critics say
"The consequences have been horrific,” historian Monica Muñoz Martinez said of not challenging the humanity or morality of the administration's policy.
Aug. 22, 2020, 8:23 AM PDT
By Suzanne Gamboa - nbc news
An NBC News report that Stephen Miller, President Donald Trump’s senior adviser and immigration policy architect, called for a show of hands among senior officials on separating children from parents is being called “a damning display of white supremacy” and a repeat of crimes against humanity seen through history.
In a meeting of 11 senior advisers, Miller warned that not enforcing the administration’s "zero tolerance" immigration policy “is the end of our country as we know it" and that opposing it would be un-American, according to two officials who were there.
When senior administration officials raised their hands and voted in favor of separating children from their migrant parents, they failed to speak against Miller's assertion or its consequences, said Brown University historian Monica Muñoz Martinez.
The vote on the child separations “is just one of those examples of people in power sitting behind desks and making decisions about what kinds of harm they’re going to inflict for political purpose, and the consequences have been horrific,” said Martinez, author of “The Injustice Never Leaves You,” which chronicled government anti-Mexican American and anti-Mexican violence in Texas.
Miller, administration officials told NBC News, "saw the separation of families not as an unfortunate byproduct but as a tool to deter more immigration. According to three former officials, he had devised plans that would have separated even more children."
No one who attended the meeting argued on the children’s behalf or on the humanity or morality of separating the largely Central American families.
The White House denied to NBC News that the show-of-hands vote took place, saying, “This is absolutely not true and did not happen.”
Almost 3,000 children were separated from their parents during the administration's zero tolerance policy.
Miller’s promotion and espousal of white nationalist ideas is well known: It was demonstrated last year in a trove of emails released by the Southern Poverty Law Center, a civil rights group that tracks white supremacy groups.
Miller cited and promoted white nationalist ideologies of white genocide, immigrants as criminals and eugenics. Those emails led to demands from many Democratic lawmakers and others for Miller to step down, which has not happened.
The vote and the subsequent child separations are “a damning display of white supremacy in action,” said Julián Castro, who served as President Barack Obama’s housing secretary.
Castro said the administration officials' vote is bizarre, particularly since former Homeland Security Secretary Kirstjen Nielsen — the Cabinet secretary whose agency would have had the primary responsibility of carrying out the policy — had strenuously objected to the separations. She was outvoted.
“That’s what happens when you put white supremacists like Stephen Miller in charge of policy and give them free rein to change important policy when it comes to immigrants," Castro said.
Nielsen had repeatedly expressed concerns before the meeting about losing track of children and did not raise her hand in favor of the plan, according to the officials in the meeting. But she later signed the memo that set in motion the prosecutions of migrants, including parents with children who would be separated.
No clear records as a way to 'hide their crimes'
Martinez noted that when attempts are made at reconciliation after human atrocities, a clear record of what took place is needed. The American public shouldn’t be pulled into denials that the vote happened, she said.
“One of the ways governments hide their crimes is by not leaving a clear record, or by leaving a record that justifies their violence by covering their crimes or describing victims as criminals or dangerous,” she said.
“Not having a clear record of all the children that were separated from their parents, not creating a clear record of how to track those children and those parents led to these devastating challenges of being able to reunify these children,” Martinez said.
Rep. Nydia Velázquez, D-N.Y., reacted to the revelations Friday with a tweet that said, “The cruelty has always been the point.”
Before rolling out the zero tolerance policy across the southern border, the Trump administration tested it in El Paso, Texas.
The congresswoman who represents El Paso, Rep. Veronica Escobar, D-Texas, has been one of the most vocal in Congress to call for Miller’s resignation and to blast Republican silence on Miller’s espousal of white nationalist ideologies.
In a tweet Thursday, she quoted the NBC News story regarding the vote and stated, “What followed is a dark chapter in America’s history. The family separation policy inflicted harm and pain on thousands of children and corroded our deepest values.”
Martinez said the U.S. will have to create a truth and reconciliation commission to deal with consequences to people affected by the policy Miller pushed and the Trump administration enacted.
The "crimes against humanity" that have been inflicted along the border, Martinez said, are happening outside public view.
“The burden for calling for justice,” she said, "has fallen on the victims themselves and their attorneys, the advocates for immigrants and refugees and the doctors who cried out and said this is child abuse.”
In a meeting of 11 senior advisers, Miller warned that not enforcing the administration’s "zero tolerance" immigration policy “is the end of our country as we know it" and that opposing it would be un-American, according to two officials who were there.
When senior administration officials raised their hands and voted in favor of separating children from their migrant parents, they failed to speak against Miller's assertion or its consequences, said Brown University historian Monica Muñoz Martinez.
The vote on the child separations “is just one of those examples of people in power sitting behind desks and making decisions about what kinds of harm they’re going to inflict for political purpose, and the consequences have been horrific,” said Martinez, author of “The Injustice Never Leaves You,” which chronicled government anti-Mexican American and anti-Mexican violence in Texas.
Miller, administration officials told NBC News, "saw the separation of families not as an unfortunate byproduct but as a tool to deter more immigration. According to three former officials, he had devised plans that would have separated even more children."
No one who attended the meeting argued on the children’s behalf or on the humanity or morality of separating the largely Central American families.
The White House denied to NBC News that the show-of-hands vote took place, saying, “This is absolutely not true and did not happen.”
Almost 3,000 children were separated from their parents during the administration's zero tolerance policy.
Miller’s promotion and espousing of white nationalist ideas is known: It was demonstrated last year in a trove of emails released by the Southern Poverty Law Center, a civil rights group that tracks white supremacy groups.
Miller cited and promoted white nationalist ideologies of white genocide, immigrants as criminals and eugenics. Those emails led to demands from many Democratic lawmakers and others for Miller to step down, which has not happened.
The vote and the subsequent child separations are “a damning display of white supremacy in action,” said Julián Castro, who served as President Barack Obama’s housing secretary.
Castro said the administration officials' vote is bizarre, particularly since former Homeland Security Secretary Kirstjen Nielsen — the Cabinet secretary whose agency would have had the primary responsibility of carrying out the policy — had strenuously objected to the separations. She was outvoted.
“That’s what happens when you put white supremacists like Stephen Miller in charge of policy and give them free rein to change important policy when it comes to immigrants," Castro said.
Nielsen had repeatedly expressed concerns before the meeting about losing track of children and did not raise her hand in favor of the plan, according to the officials in the meeting. But she later signed the memo that set in motion the prosecutions of migrants, including parents with children who would be separated.
No clear records as a way to 'hide their crimes'
Martinez noted that when attempts are made at reconciliation after human atrocities, a clear record of what took place is needed. The American public shouldn’t be pulled into denials that the vote happened, she said.
“One of the ways governments hide their crimes is by not leaving a clear record, or by leaving a record that justifies their violence by covering their crimes or describing victims as criminals or dangerous,” she said.
“Not having a clear record of all the children that were separated from their parents, not creating a clear record of how to track those children and those parents led to these devastating challenges of being able to reunify these children,” Martinez said.
Rep. Nydia Velázquez, D-N.Y., reacted to the revelations Friday with a tweet that said, “The cruelty has always been the point.”
Before rolling out the zero tolerance policy across the southern border, the Trump administration tested it in El Paso, Texas.
The congresswoman who represents El Paso, Rep. Veronica Escobar, D-Texas, has been one of the most vocal in Congress to call for Miller’s resignation and to blast Republican silence on Miller’s espousal of white nationalist ideologies.
In a tweet Thursday, she quoted the NBC News story regarding the vote and stated, “What followed is a dark chapter in America’s history. The family separation policy inflicted harm and pain on thousands of children and corroded our deepest values.”
Martinez said the U.S. will have to create a truth and reconciliation commission to deal with consequences to people affected by the policy Miller pushed and the Trump administration enacted.
The "crimes against humanity" that have been inflicted along the border, Martinez said, are happening outside public view.
“The burden for calling for justice,” she said, "has fallen on the victims themselves and their attorneys, the advocates for immigrants and refugees and the doctors who cried out and said this is child abuse.”
GREED VS HEALTH!!!
N.Y. Nurses Say Used N95 Masks are Being Re-Packed in Boxes to Look New
Jocelyn Grzeszczak - NEWSWEEK
8/21/2020
Nurses working at an intensive care unit in a New York hospital say they discovered used N95 respirator masks in a storage closet that are being made to look new and distributed for use by hospital officials.
The ICU nurses sent a video to News 12 Sunday showing dozens of the N95 masks, which provide the best protection against COVID-19, hanging on a clothesline in a storage closet at the Vassar Brothers Medical Center (VBMC) in Poughkeepsie.
The nurses said the hospital was recycling the masks and boxing them up to make them appear new, something hospital administrators denied.
"All those masks were there to be packed away and saved," VBMC President Peter Kelly told News 12 Thursday, adding that VBMC and other hospitals around the country have saved masks to be sanitized and reused only if there's another national shortage of personal protective equipment (PPE), like the one seen earlier this year when the pandemic first hit the U.S.
"One of the things that we were doing was drying them before we packaged them away and we have never had to use them," Kelly said.
VBMC officials maintain the hospital has a three-month supply of PPE, as required by New York state officials.
But the nurses, speaking anonymously to News 12 for fear of retribution, disagreed. They remain worried about their health and safety while at work in the ICU, and one nurse told News 12 she received a mask that was marked as new but had makeup stains on it.
Mid Hudson News obtained photos of the allegedly used masks on Monday and forwarded them to VBMC administrators.
"We have never used recycled masks at Vassar Brothers Medical Center," VBMC spokesman John Nelson told Mid Hudson News, adding that "if the photo was taken here, it would have been at a time, early in the pandemic, when there was a national shortage of masks and other personal protective equipment. We were preparing for a possible worst-case scenario, which would have included properly sterilizing and reusing masks."
Newsweek contacted VBMC for further comment, but did not hear back in time for publication.
Experts have published studies detailing how medical-grade masks can be successfully decontaminated. Two University of Illinois at Urbana-Champaign professors recently discovered that electric cookers, like rice cookers or Instant Pots, could be an effective way of sterilizing used masks for repeated use, The Washington Post reported.
The extremely warm temperatures and dry heat produced by the electric cookers cleaned the masks without damaging fit or filtration efficiency, the professors' study found.
The Centers for Disease Control and Prevention (CDC) previously released its own set of recommendations for reusing the respirator masks, writing that there was no way of determining the maximum possible number of safe reuses for N95 masks.
"Hang used respirators in a designated storage area or keep them in a clean, breathable container such as a paper bag between uses," the recommendations state. "To minimize potential cross-contamination, store respirators so that they do not touch each other and the person using the respirator is clearly identified. Storage containers should be disposed of or cleaned regularly."
What to Know When Someone Blames Black People for All the Riots
by Paul Buchheit | the smirking chimp
August 18, 2020 - 7:09am
It's easy for lazy thinkers to gloss over the deep-seated reasons for the anger and violence of downtrodden people. Recent lootings of upscale shops on Chicago's Magnificent Mile brought out waves of disgust for the 'thugs' who were said to care little for human values.
But an honest review of the facts should compel a rational thinker to redirect his or her disgust—away from racist conclusions, and toward the outrageous indecencies that a well-to-do society forces on its less fortunate members.
1. For Every $100.00 in Median White Wealth, Black People Have $8.70
In 1968, it was $9.43. It keeps getting worse for beaten-down Americans. In the past twenty years, median income for Black households has dropped 5 percent while increasing 6 percent for White households. This year the Black and Latino communities have suffered the greatest effects of the coronavirus pandemic. Because of their job losses and lack of savings and inability to maintain rent payments, they will be taking the brunt of a growing housing crisis, especially in the big cities, where rent for Millennials is already over two-thirds of their incomes. Black and Latino households are also more likely to face food insufficiencies. Yet the Trump administration recently considered cuts to the food stamp program.
2. Congress Won't Provide Living-Wage Jobs
When asked what he would do to bring jobs to Kentucky, Mitch McConnell responded, "That is not my job. It is the primary responsibility of the state Commerce Cabinet."
His hypocrisy becomes evident when we consider the Republican demand for work requirements for Medicaid and food stamps and other forms of social programs. Millionaires scream "get a job" from their positions of ignorance. But in recent years almost all the new jobs have been temporary or contract-based, with no benefits and no job security.
There is plenty of potential work out there: infrastructure projects for roads, bridges, transit systems, and wastewater treatment; the planting of trees and development of parkland; postal service upgrades; the construction of barriers to protect against sea-level rise and hurricanes. And more work indoors: home care workers to aid the aged and disabled; child care; special needs programs; and not the least a resurgence in arts initiatives.
Of course there is opposition to federal job initiatives, from by-the-book economists and self-serving free-market advocates who fear that "big government" will cut into their stock market gains.
But there is evidence for the success of guaranteed job programs, starting with the Depression-era Works Progress Administration, which put 8.5 million Americans to work building new roads, bridges, and parks around the country. More recently, according to the Georgetown Law Center on Poverty and Inequality, subsidized employment programs have "reduced family public benefit receipt, raised school outcomes among the children of workers, boosted workers’ school completion, lowered criminal justice system involvement among both workers and their children, improved psychological well-being, and reduced longer-term poverty."
3. Big Tech Won't Provide Living-Wage Jobs, Even After Decades of Taxpayer Subsidies
After building their businesses on 70 years of taxpayer-funded research and development, Apple/Amazon/Google/Microsoft/Facebook have expanded their composite net worth to almost $7 trillion, while avoiding over a hundred billion dollars in taxes. This disdain for Americans certainly warrants a protest. Big Tech has appropriated decades of taxpayer-funded research all for the benefit of stockholders who keep adding millions to their portfolios.
Insult is added to injury. Said an Amazon Chief Technologist: "The more robots we add to our fulfillment centers, the more jobs we are creating."
In reality, as MIT economist Daron Acemoglu explains, "Look at the business model of Google, Facebook, Netflix. They’re not in the business of creating new tasks for humans." Big Tech would have us believe that while their machines handle the mundane tasks, more sophisticated humans will become "bot builders" and app designers. But just how many of us can become "bot builders"?
The old argument that the loss of jobs to technology has always been followed by a new and better class of work becomes meaningless when the machines start doing our thinking for us. Tech CEO Rob LoCascio says that a call center bot can respond to 10,000 queries in an hour, compared to six for a human being. JP Morgan says that AI programs can review commercial loan agreements in a few seconds, compared to thousands of hours of lawyers' time.
And just as Covid is hitting Black people harder than White people, so AI is displacing Black workers at a faster pace than White workers, because of their greater representation in vulnerable areas like food service, office administration, and transportation.
4. Even Martin Luther King Realized that Non-Violent Protest was Not Enough to Effect Change
Martin Luther King is remembered for peaceful protests, starting with the Montgomery, Alabama bus boycott in the mid-1950s and continuing through his Poor People's March in 1968. But he understood the role of violence. In his 1963 "Letter from a Birmingham Jail" he wrote, "We know through painful experience that freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed." And in 1968 he said: "I think that we’ve got to see that a riot is the language of the unheard."
At times for Reverend King the violence was triggered in response to a peaceful but provocative act. He once helped initiate a Children's Crusade of young marchers, knowing that racist backlash would elicit public sympathy. Sure enough, Birmingham Commissioner of Public Safety Bull Connor ordered police to attack the children with powerful fire hoses and batons, and to threaten them with police dogs.
Historian Daniel Q. Gillion summarizes: "Nonviolent protest brings awareness to an issue; violent protest brings urgency to an issue."
A Progressive Condemnation of Violence
It might appear that some of the above condones violent protests. Just the opposite. Lasting peace on the streets can only be attained through long-term measures that seek equality in living-wage job opportunities. Nothing should keep Congress from implementing a guaranteed jobs program. The money's always there. As Senator Brian Schatz, D-Hawaii, said: "After the Republicans did the $1.5 trillion in unpaid-for tax cuts...I just reject the idea that only progressive ideas have to be paid for."
US unions
How much does union membership benefit America's workers?
Even as membership shrinks, official data shows a clear fact: members earn more than non-members
Mona Chalabi in New York
the guardian
Sun 24 Nov 2019 03.00 EST
Union membership in the US has fallen dramatically over the last generation. In 1980, one in five workers was in a union, but today it’s just one in 10. Despite that, data shows being in a union is still very effective in protecting the rights of workers and their earnings.
Data on full-time wage and salary workers comes from the Bureau of Labor Statistics (BLS). The bureau’s numbers on weekly earnings show a clear and consistent fact: members of unions earn more than non-members.
The earnings of those represented by unions were almost identical to the earnings of those who are members (for the charts below, I used membership). Women who are members of unions earn $10 more per week than men who aren’t members of a union.
The racial pay gap is so large that although union membership has a big impact, it still does not quite bridge the difference. Black union members earn $63 less each week than their white counterparts who are non-union. Hispanic workers who are non-union have some of the lowest earnings in the country – just $657 per week, almost half of what white union members can expect.
American billionaires’ ties to Moscow go back decades
July 15, 2019
By History News Network - Raw Story
Sleazy American businessmen who make deals with corrupt Russians with shadowy connections to intelligence agents or powerful politicians did not spring up de novo in the twenty-first century. As far back as 1919, Armand Hammer, the future head of Occidental Petroleum, met with Vladimir Lenin and agreed to a truly corrupt arrangement which enabled his family to obtain a lucrative business concession in the Soviet Union in return for financing illegal communist operations in Europe and the fledgling American Communist Party.
In the 1970s, Hammer introduced another American businessman to his friends in Moscow. Although David Karr, the former CEO of Fairbanks Whitney, never became a household name like his sponsor, he made millions of dollars doing business with the USSR, while providing the KGB with an entrée into the American political world. While less financially successful than Hammer, Karr, the subject of my new biography, The Millionaire Was a Soviet Mole: The Twisted Life of David Karr, lived an even more adventurous life, ricocheting from young Communist to government bureaucrat, from newsman to public relations man, from proxy fighter to international businessman, from Hollywood mogul to the hotel business, and finally to a KGB informant. When he died in Paris in 1979 at the age of 60, his Machiavellian past and his myriad of enemies inspired speculation in the French press that he had been murdered by the KGB, CIA, Israeli Mossad, or the Mafia. Remarkably, there were scenarios, not all plausible, that could finger any one of that unlikely quartet as his killer.
Born into a middle-class Jewish family in New York, David Katz (as he was then known) was attracted to the CPUSA by its militant anti-fascism and began writing for its newspaper, the Daily Worker, in the 1930s. For the rest of his life, however, he insisted he had never joined the Party. His early flirtation with communism, though, haunted him for more than twenty years. He went to work for the Office of War Information during World War II, but was forced to resign after being denounced by Congressman Martin Dies for his past associations. Becoming syndicated columnist Drew Pearson’s chief “leg man,” he was denounced on the floor of the Senate by Senator Joseph McCarthy as Pearson’s KGB controller. He was a frequent target of right-wing columnists, particularly Westbrook Pegler. Throughout the 1940s and 1950s Karr himself tried to cozy up to the FBI, claiming he had provided information on Communists.
In the early 1950s he moved back to New York and joined a public relations firm, eventually setting up his own business and developing a specialty in working for corporate raiders during proxy wars. With another shady character, Alfons Landa, as his partner, in 1959 Karr successfully took control of Fairbanks Whitney, a diversified company that held a number of defense contracts. One of the youngest CEOs of a major American corporation, Karr faced problems getting a security clearance, but even more difficulty running a company. Within four years, amid constant losses, an exodus of high-ranking personnel, shareholder complaints about excessive executive salaries, and the failure of a major investment in a water desalinization initiative in Israel, Karr was forced out. Undaunted, he relocated to Broadway and Hollywood, serving as a co-producer for several movies and shows.
Karr’s personal life was almost as tumultuous as his business career. Already married and divorced twice, with four children, he was engaged to a glamorous Hollywood actress when he met a wealthy French woman. Breaking the engagement, he married and moved to Paris where her parents bought them a luxurious apartment. Parlaying his wife’s connections, Karr brokered the sale of several luxury hotels to British magnate Charles Forte and became general manager of the iconic George V. From his new perch in French society, he did business deals with Aristotle Onassis, befriended America’s ambassador to France Sargent Shriver, and established ties to the Lazard Freres investment house.
Through Shriver, Karr began the final phase of his peripatetic career. He was introduced to Armand Hammer and he accompanied the owner of Occidental Petroleum to Moscow. Developing close ties to Djerman Gvishiani, a high-ranking Soviet trade official and Premier Kosygin’s son-in-law, Karr soon became a key figure negotiating for Western companies hoping to gain access to the Soviet market.
His access to Soviet officials and lucrative contracts sparked rumors about possible intelligence connections. Not only did Karr arrange the financing of the first Western hotel built in the USSR in the run-up to the Moscow Olympics, but he controlled the North American rights to Misha the Bear, the mascot for the Games, in partnership with former Senator John Tunney. He and Hammer also had the rights to market Olympic commemorative coins. Amid published claims that he had bribed Gvishiani, and with labor troubles afflicting Forte’s Paris hotels, Karr became ill in Moscow in July 1979 at events celebrating the opening of the Kosmos Hotel. He died of an apparent heart attack shortly after returning to Paris.
His personal life also took a dramatic turn in the late 1970s. Divorcing his French wife, he married a German model, more than thirty years his junior. In less than a year of marriage, there were two divorce filings. Shunned by his family, his widow made a series of sensational charges in the French press, alleging he had been murdered. Journalists retailed stories from anonymous sources that he had cheated the Soviets on the coin deal and that he and an Israeli partner had run guns to Idi Amin and Muammar Khadaffi. A British journalist published a sensational book alleging that he and Onassis had arranged the murder of Bobby Kennedy with the PLO.
Untangling these lurid accusations, I have been able to establish that Karr was recruited as source by the KGB in the early 1970s. He never had access to top-secret information but he did report about the presidential campaigns of Shriver, Scoop Jackson, Jerry Brown and Jimmy Carter. He tried to insinuate himself in the Gerald Ford White House. He probably also worked for the Mossad. Was it any wonder that many people and organizations had good reason to dislike and fear David Karr?
His secretive empire with corporations, trusts, and entities in several countries required a decade before probate was completed, as three ex-wives, one widow, and five children wrangled over the estate. His complicated legacy is well worth pursuing, illustrating the intersecting worlds of business, journalism, politics and espionage.
In the 1970s, Hammer introduced another American businessman to his friends in Moscow. Although David Karr, the former CEO of Fairbanks Whitney, never became a household name like his sponsor, he made millions of dollars doing business with the USSR, while providing the KGB with an entrée into the American political world. While less financially successful than Hammer, Karr, the subject of my new biography, The Millionaire Was a Soviet Mole: The Twisted Life of David Karr, lived an even more adventurous life, ricocheting from young Communist to government bureaucrat, from newsman to public relations man, from proxy fighter to international businessman, from Hollywood mogul to the hotel business, and finally to a KGB informant. When he died in Paris in 1979 at the age of 60, his Machiavellian past and his myriad of enemies inspired speculation in the French press that he had been murdered by the KGB, CIA, Israeli Mossad, or the Mafia. Remarkably, there were scenarios, not all plausible, that could finger any one of that unlikely quartet as his killer.
Born into a middle-class Jewish family in New York, David Katz (as he was then known) was attracted to the CPUSA by its militant anti-fascism and began writing for its newspaper, the Daily Worker in the 1930s. For the rest of his life, however, he insisted he had never joined the Party. His early flirtation with communism, though, haunted him for more than twenty years. He went to work for the Office of War Information during World War II, but was forced to resign after being denounced by Congressman Martin Dies for his past associations. Becoming syndicated columnist Drew Pearson’s chief “leg man,” he was denounced on the floor of the Senate by Senator Joseph McCarthy as Pearson’s KGB controller. He was a frequent target of right-wing columnists, particularly Westbrook Pegler. Throughout the 1940s and 1950s Karr himself tried to cozy up to the FBI, claiming he had provided information on Communists.
In the early 1950s he moved back to New York and joined a public relations firm, eventually setting up his own business and developing a specialty in working for corporate raiders during proxy wars. With another shady character, Alfons Landa, as his partner, in 1959 Karr successfully took control of Fairbanks Whitney, a diversified company that held a number of defense contracts. One of the youngest CEOs of a major American corporation, Karr faced problems getting a security clearance, but even more difficulty running a company. Within four years, amid constant losses, an exodus of high-ranking personnel, shareholder complaints about excessive executive salaries, and the failure of a major investment in a water desalinization initiative in Israel, Karr was forced out. Undaunted, he relocated to Broadway and Hollywood, serving as a co-producer for several movies and shows.
Karr’s personal life was almost as tumultuous as his business career. Already married and divorced twice, with four children, he was engaged to a glamorous Hollywood actress when he met a wealthy French woman. Breaking the engagement, he married and moved to Paris where her parents bought them a luxurious apartment. Parlaying his wife’s connections, Karr brokered the sale of several luxury hotels to British magnate Charles Forte and became general manager of the iconic George V. From his new perch in French society, he did business deals with Aristotle Onassis, befriended America’s ambassador to France Sargent Shriver, and established ties to the Lazard Freres investment house.
Through Shriver, Karr began the final phase of his peripatetic career. He was introduced to Armand Hammer and he accompanied the owner of Occidental Petroleum to Moscow. Developing close ties to Djerman Gvishiani, a high-ranking Soviet trade official and Premier Kosygin’s son-in-law, Karr soon became a key figure negotiating for Western companies hoping to gain access to the Soviet market.
His access to Soviet officials and lucrative contracts sparked rumors about possible intelligence connections. Not only did Karr arrange the financing of the first Western hotel built in the USSR in the run-up to the Moscow Olympics, but he controlled the North American rights to Misha the Bear, the mascot for the Games, in partnership with former Senator John Tunney. He and Hammer also had the rights to market Olympic commemorative coins. Amid published claims that he had bribed Gvishiani, and with labor troubles afflicting Forte’s Paris hotels, Karr became ill in Moscow in July 1979 at events celebrating the opening of the Kosmos Hotel. He died of an apparent heart attack shortly after returning to Paris.
His personal life also took a dramatic turn in the late 1970s. Divorcing his French wife, he married a German model, more than thirty years his junior. In less than a year of marriage, there were two divorce filings. Shunned by his family, his widow made a series of sensational charges in the French press, alleging he had been murdered. Journalists retailed stories from anonymous sources that he had cheated the Soviets on the coin deal and that he and an Israeli partner had run guns to Idi Amin and Muammar Khadaffi. A British journalist published a sensational book alleging that he and Onassis had arranged the murder of Bobby Kennedy with the PLO.
Untangling these lurid accusations, I have been able to establish that Karr was recruited as a source by the KGB in the early 1970s. He never had access to top-secret information, but he did report on the presidential campaigns of Shriver, Scoop Jackson, Jerry Brown and Jimmy Carter. He tried to insinuate himself into the Gerald Ford White House. He probably also worked for the Mossad. Was it any wonder that many people and organizations had good reason to dislike and fear David Karr?
His secretive empire of corporations, trusts, and entities in several countries took a decade to move through probate, as three ex-wives, one widow, and five children wrangled over the estate. His complicated legacy is well worth pursuing, illustrating the intersecting worlds of business, journalism, politics and espionage.
America’s biggest lie
by Les Leopold / Common Dreams - alternet
April 23, 2019
Pundits and politicians repeatedly warn us that the country cannot afford costly social services. They caution about the perils of a rising national debt, the supposed near bankruptcy of Medicare and Social Security, and the need to sell public services to the highest bidder in order to save them. We must tighten our belts sooner or later, they tell us, rather than spend on social goods like universal health care, free higher education and badly needed infrastructure.
To many Americans this sounds all too true because they are having an incredibly tough time making ends meet. According to the Federal Reserve, “Four in 10 adults in 2017 would either borrow, sell something, or not be able to pay if faced with a $400 emergency expense.” To those who are so highly stressed financially, the idea of paying for a costly program like Medicare for All sounds impossible.
We are the richest country in the history of the world, however, and certainly could afford vastly expanded and improved vital public services if we had the will. What stands in the way is runaway inequality. Our nation’s wealth has been hijacked by the super-rich, with plenty of aid from their paid-for politicians.
Over the past 40 years the top fraction of the top one percent have systematically denied working people the fruits of their enormous productivity. The results of this wage theft can be seen clearly in the chart below which tracks productivity (output per hour of labor) and average weekly wages (after accounting for inflation) of non-supervisory and production workers (about 85 percent of the workforce).
The red line shows the rise of American productivity since WWII. While not a perfect measurement of the power of our economy, it does capture our overall level of knowledge, technical skills, worker effort, management systems, and technology. Ever-rising productivity is the key to the wealth of nations.
As we can see clearly, the productivity trend has been ever upward. Today the average worker is nearly three times as productive per hour of labor as he or she was at the end of WWII. And the workforce is more than three times as large. That means we are a colossal economic powerhouse. But unless you are an economic elite, it doesn’t feel that way.
To understand why, we need to look at the blue line – average real worker wages. From WWII to the late 1970s or so, as productivity rose so did the average real wage of nearly all American workers. Year by year most working people saw their standard of living rise. For every one percent increase in productivity, about two-thirds went to labor, and the remaining one-third went to capital investment and profits.
After the late 1970s, however, the story changes dramatically. Real wages have stalled for more than 40 years, while productivity continues to climb. Had wages and productivity continued to rise in tandem, the average weekly earnings of the American worker would be almost double what they are today, rising from today’s average of $746 per week to $1,377 per week.
What happened? Where did all the money go that once went to working people?
The fatalistic story sold to us by elite-funded think tanks is that the rise of international competition and the introduction of advanced technology crushed the wages of those without the highest skill levels. The typical worker, it is claimed, is now competing with cheap labor from around the world and hence sees his or her wages stall and even decline. And since there really isn’t much anyone can do about globalization or automation, there’s nothing we can do about the stalling wages. Such a convenient story to justify runaway inequality!
The real story is far more complex and troubling. Yes, globalization and automation contribute to stagnant wages. But as the International Labor Organization shows in their remarkable 2012 study, only about 30 percent of this wage stagnation can be attributed to technology and globalization. The main cause is the neo-liberal policy agenda of deregulation of finance, cuts in social spending and attacks on labor unions. And within that mix the biggest driver of wage stagnation can be attributed to financialization – the deregulation of Wall Street which permitted — for the first time since the Great Depression — the rapacious financial strip-mining of workers, students, families and communities. Put simply, the neo-liberal model ushered in by Thatcher and Reagan, and then intensified by Clinton and Blair, moved the wealth that once flowed to working people to financial and corporate elites. (For a more thorough account see Professor William Lazonick’s “Profits without Prosperity.” )
How much money are we losing?
More than we can imagine. Here’s a back-of-the-envelope estimate for just the most recent year on the chart, 2017: The gap between the productivity wage and the current average wage is $631 per week. That is, if the average weekly wage had continued to rise with productivity, it would be $631 higher than the current average wage. In 2017, there were 103 million of these workers. So the total amount of “lost” wages that flowed to economic elites is a whopping $3.4 trillion (103 million x $631 x 52 weeks). And that’s just for one year.
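The back-of-the-envelope figure above can be checked directly. A minimal sketch, using only the numbers the article itself quotes (a $631 weekly gap, 103 million production and non-supervisory workers, 52 weeks):

```python
# Check of the article's "lost wages" estimate for 2017,
# using its own stated inputs.
weekly_gap = 631          # dollars per worker per week (article's figure)
workers = 103_000_000     # production/non-supervisory workers in 2017
weeks = 52

lost_wages = weekly_gap * workers * weeks
print(f"${lost_wages / 1e12:.1f} trillion per year")  # → $3.4 trillion per year
```

The product comes to roughly $3.38 trillion, which rounds to the article's $3.4 trillion.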
Here’s how to pay for Medicare for All
Current estimates for a single payer system come in at about $3 trillion per year. Americans already are paying $1.9 trillion in payroll and income taxes that go to Medicare, Medicaid and other government health programs. So, we would need to raise another $1.1 trillion to provide universal healthcare and prescription drugs with no co-pays, no deductibles and no premiums.
In the name of fairness, that additional $1.1 trillion to pay for Medicare for All should be raised by taxing back a portion of the $3.4 trillion in wealth that has flowed to the super rich instead of to working people. After all, working people have not had a real wage increase in more than forty years as economic elites have siphoned away tens of trillions of dollars of income that once went to working people, (the gap between the two lines). Wouldn’t it be more than fair to ask the superrich to pay the $1.1 trillion needed for Medicare for All?
There are numerous ways to make these economic elites pay their fair share:
- Financial transaction tax on stock, bond and derivative transactions;
- Wealth tax on those with over $50 million in wealth;
- Minimum corporate tax of 35 percent on all corporations with over $100 million in profits that now pay little or nothing, as Amazon did this past year;
- Raise the marginal tax bracket to 70% on all income over $10 million per year;
- Increase the inheritance tax on the super-rich to prevent the creation of a permanent oligarchy…and so on.
The point is to give Americans what they have been long denied – high quality universal healthcare AND a real wage increase by providing Medicare for All with no co-pays, no deductibles and no premiums.
Think about how much your income would go up if you didn’t have to pay for healthcare at all. That would begin to close the gap between productivity and wages for the first time in a generation.
Wait! Shouldn’t working people pay something for health care coverage?
We already are. We pay payroll taxes for Medicare, and a portion of our regular taxes goes to fund Medicaid and other public health programs for veterans, Native Americans, and public health-related research and regulation.
So next time you hear someone say we can’t afford a public good, that we need to tighten our belts and get used to austerity, think about all that wealth that has flowed to the very top. Think about that big fat gap between productivity and real wages. Think about how runaway inequality has allowed the wealth of the richest nation in history to be hijacked by the super-rich.
It’s time the American people got a real wage increase and Medicare for All would deliver just that.
Capitalism and democracy: the strain is showing
by Martin Wolf
From Financial Times: To maintain legitimacy, economic policy must seek to promote the interests of the many not the few
Is the marriage between liberal democracy and global capitalism an enduring one? Political developments across the west — particularly the candidacy of an authoritarian populist for the presidency of the most important democracy — heighten the importance of this question. One cannot take for granted the success of the political and economic systems that guide the western world and have been a force of attraction for much of the rest for four decades. The question then arises: if not these, what?
A natural connection exists between liberal democracy — the combination of universal suffrage with entrenched civil and personal rights — and capitalism, the right to buy and sell goods, services, capital and one’s own labour freely. They share the belief that people should make their own choices as individuals and as citizens. Democracy and capitalism share the assumption that people are entitled to exercise agency. Humans must be viewed as agents, not just as objects of other people’s power.
Yet it is also easy to identify tensions between democracy and capitalism. Democracy is egalitarian. Capitalism is inegalitarian, at least in terms of outcomes. If the economy flounders, the majority might choose authoritarianism, as in the 1930s. If economic outcomes become too unequal, the rich might turn democracy into plutocracy.
Historically, the rise of capitalism and the pressure for an ever- broader suffrage went together. This is why the richest countries are liberal democracies with, more or less, capitalist economies. Widely shared increases in real incomes played a vital part in legitimising capitalism and stabilising democracy. Today, however, capitalism is finding it far more difficult to generate such improvements in prosperity. On the contrary, the evidence is of growing inequality and slowing productivity growth. This poisonous brew makes democracy intolerant and capitalism illegitimate.
Today’s capitalism is global. This, too, can be regarded as natural. Left to themselves, capitalists will not limit their activities to any given jurisdiction. If opportunities are global so, too, will be their activities. So, as a result, are economic organisations, particularly big companies.
Yet, as Professor Dani Rodrik of Harvard University has noted, globalisation constrains national autonomy. He writes that “democracy, national sovereignty and global economic integration are mutually incompatible: we can combine any two of the three but never have all three simultaneously and in full”. If countries are free to set national regulations, the freedom to buy and sell across frontiers will be reduced. Alternatively, if barriers are removed and regulations harmonised, the legislative autonomy of states will be limited. Freedom of capital to cross borders is particularly likely to constrain states’ ability to set their own taxes and regulations.
Moreover, a common feature of periods of globalisation is mass migration. Movement across borders creates the most extreme conflict between individual liberty and democratic sovereignty. The former says that people should be allowed to move where they like. The latter says that citizenship is a collective property right, access to which citizens control. Meanwhile, businesses view the ability to hire freely as invaluable. It is not merely unsurprising that migration has become the lightning rod of contemporary democratic politics. Migration is bound to create friction between national democracy and global economic opportunity.
Consider the disappointing recent performance of global capitalism, not least the shock of the financial crisis and its devastating effect on trust in the elites in charge of our political and economic arrangements. Given all this, confidence in an enduring marriage between liberal democracy and global capitalism seems unwarranted.
So what might take its place? One possibility would be the rise of a global plutocracy and so in effect the end of national democracies. As in the Roman empire, the forms of republics might endure but the reality would be gone.
An opposite alternative would be the rise of illiberal democracies or outright plebiscitary dictatorships, in which the elected ruler exercises control over both the state and capitalists. This is happening in Russia and Turkey. Controlled national capitalism would then replace global capitalism. Something rather like that happened in the 1930s. It is not hard to identify western politicians who would love to go in exactly this direction.
Meanwhile, those of us who wish to preserve both liberal democracy and global capitalism must confront serious questions. One is whether it makes sense to promote further international agreements that tightly constrain national regulatory discretion in the interests of existing corporations. My view increasingly echoes that of Prof Lawrence Summers of Harvard, who has argued that “international agreements [should] be judged not by how much is harmonised or by how many barriers are torn down but whether citizens are empowered”. Trade brings gains but cannot be pursued at all costs.
Above all, if the legitimacy of our democratic political systems is to be maintained, economic policy must be orientated towards promoting the interests of the many not the few; in the first place would be the citizenry, to whom the politicians are accountable. If we fail to do this, the basis of our political order seems likely to founder. That would be good for no one. The marriage of liberal democracy with capitalism needs some nurturing. It must not be taken for granted.
Is the marriage between liberal democracy and global capitalism an enduring one? Political developments across the west — particularly the candidacy of an authoritarian populist for the presidency of the most important democracy — heighten the importance of this question. One cannot take for granted the success of the political and economic systems that guide the western world and have been a force of attraction for much of the rest for four decades. The question then arises: if not these, what?
A natural connection exists between liberal democracy — the combination of universal suffrage with entrenched civil and personal rights — and capitalism, the right to buy and sell goods, services, capital and one’s own labour freely. They share the belief that people should make their own choices as individuals and as citizens. Democracy and capitalism share the assumption that people are entitled to exercise agency. Humans must be viewed as agents, not just as objects of other people’s power.
Yet it is also easy to identify tensions between democracy and capitalism. Democracy is egalitarian. Capitalism is inegalitarian, at least in terms of outcomes. If the economy flounders, the majority might choose authoritarianism, as in the 1930s. If economic outcomes become too unequal, the rich might turn democracy into plutocracy.
Historically, the rise of capitalism and the pressure for an ever- broader suffrage went together. This is why the richest countries are liberal democracies with, more or less, capitalist economies. Widely shared increases in real incomes played a vital part in legitimising capitalism and stabilising democracy. Today, however, capitalism is finding it far more difficult to generate such improvements in prosperity. On the contrary, the evidence is of growing inequality and slowing productivity growth. This poisonous brew makes democracy intolerant and capitalism illegitimate.
Today’s capitalism is global. This, too, can be regarded as natural. Left to themselves, capitalists will not limit their activities to any given jurisdiction. If opportunities are global so, too, will be their activities. So, as a result, are economic organisations, particularly big companies.
Yet, as Professor Dani Rodrik of Harvard University has noted, globalisation constrains national autonomy. He writes that “democracy, national sovereignty and global economic integration are mutually incompatible: we can combine any two of the three but never have all three simultaneously and in full”. If countries are free to set national regulations, the freedom to buy and sell across frontiers will be reduced. Alternatively, if barriers are removed and regulations harmonised, the legislative autonomy of states will be limited. Freedom of capital to cross borders is particularly likely to constrain states’ ability to set their own taxes and regulations.
Moreover, a common feature of periods of globalisation is mass migration. Movement across borders creates the most extreme conflict between individual liberty and democratic sovereignty. The former says that people should be allowed to move where they like. The latter says that citizenship is a collective property right, access to which citizens control. Meanwhile, businesses view the ability to hire freely as invaluable. It is unsurprising, then, that migration has become the lightning rod of contemporary democratic politics. Migration is bound to create friction between national democracy and global economic opportunity.
Consider the disappointing recent performance of global capitalism, not least the shock of the financial crisis and its devastating effect on trust in the elites in charge of our political and economic arrangements. Given all this, confidence in an enduring marriage between liberal democracy and global capitalism seems unwarranted.
So what might take its place? One possibility would be the rise of a global plutocracy and so in effect the end of national democracies. As in the Roman empire, the forms of republics might endure but the reality would be gone.
An opposite alternative would be the rise of illiberal democracies or outright plebiscitary dictatorships, in which the elected ruler exercises control over both the state and capitalists. This is happening in Russia and Turkey. Controlled national capitalism would then replace global capitalism. Something rather like that happened in the 1930s. It is not hard to identify western politicians who would love to go in exactly this direction.
Meanwhile, those of us who wish to preserve both liberal democracy and global capitalism must confront serious questions. One is whether it makes sense to promote further international agreements that tightly constrain national regulatory discretion in the interests of existing corporations. My view increasingly echoes that of Prof Lawrence Summers of Harvard, who has argued that “international agreements [should] be judged not by how much is harmonised or by how many barriers are torn down but whether citizens are empowered”. Trade brings gains but cannot be pursued at all costs.
Above all, if the legitimacy of our democratic political systems is to be maintained, economic policy must be oriented towards promoting the interests of the many, not the few; first among them the citizenry, to whom the politicians are accountable. If we fail to do this, the basis of our political order seems likely to founder. That would be good for no one. The marriage of liberal democracy with capitalism needs some nurturing. It must not be taken for granted.
Almost Two-Thirds of People in the Labor Force Do Not Have a College Degree, and Their Job Prospects Are Dimming
Stable, middle-class jobs are on the decline for workers without college degrees. The Trans-Pacific Partnership is not helping.
By Robert Scott, David Cooper / Economic Policy Institute
Almost two-thirds of people in the labor force (65.1 percent) do not have a college degree. In fact, people without a college degree (which includes those without a high school degree, those with only a high school degree, those with some college education, and those with an associate's degree) make up the majority of the labor force in every state; the District of Columbia is the sole exception. Mississippi has the highest share of non-college-educated workers (75.7 percent), while Massachusetts and the District of Columbia have the lowest shares (51 percent and 33.7 percent, respectively).
It is no secret that wages for typical workers have stagnated over the past 35 years. The lagging recovery of the construction and manufacturing sectors, two sectors that traditionally provide strong wages for workers without college degrees, is one reason for this wage stagnation. Trade agreements like the Trans-Pacific Partnership threaten to make strong, middle-class jobs even more elusive for non-college-educated workers.
We cannot solve the problem of low and stagnating wages for non-college educated workers by expecting everyone to pursue more education. We need solutions that will raise wages for all workers, regardless of educational attainment. These solutions include raising the minimum wage, strengthening collective bargaining rights, prioritizing very low rates of unemployment through monetary policy, and reducing our trade deficit by stopping destructive currency manipulation.
Top 10 Ways the US is the Most Corrupt Country in the World
By Juan Cole
From Informed Comment: Those ratings that castigate Afghanistan and some other poor countries as hopelessly “corrupt” always imply that the United States is not corrupt.
While it is true that you don’t typically have to bribe your postman to deliver the mail in the US, in many key ways America’s political and financial practices make it in absolute terms far more corrupt than the usual global South suspects. After all, the US economy is worth over $16 trillion a year, so in our corruption a lot more money changes hands.
1. Instead of having short, publicly funded political campaigns with limited and/or free advertising (as a number of Western European countries do), the US has long political campaigns in which candidates must raise big bucks for advertising. They are therefore forced to spend much of their time fundraising, which is to say, seeking bribes. All American politicians are basically on the take, though many are honorable people. They are forced into it by the system. House Majority Leader John Boehner once actually handed out checks from the tobacco industry to other representatives on the floor of the House.
When French President Nicolas Sarkozy was defeated in 2012, soon thereafter French police actually went into his private residence searching for an alleged $50,000 in illicit campaign contributions from the L’Oréal heiress. I thought to myself, seriously? $50,000 in a presidential campaign? Our presidential campaigns cost a billion dollars each! $50,000 is a rounding error, not a basis for police action. Why, George W. Bush took millions from arms manufacturers and then ginned up a war for them, and the police haven’t been anywhere near his house.
American politicians don’t represent “the people.” With a few honorable exceptions, they represent the 1%. American democracy is being corrupted out of existence.
2. That politicians can be bribed to reduce regulation of industries like banking (what is called “regulatory capture”) means that they will be so bribed. Billions were spent, and 3,000 lobbyists employed, by bankers to remove cumbersome rules in the 2000s. Thus, political corruption enabled financial corruption (in some cases legalizing it!). Without regulations and government auditing, the finance sector went wild and engaged in corrupt practices that caused the 2008 crash. Too bad the poor Afghans can’t just legislate their corruption out of existence by regularizing it, the way Wall Street did.
3. That the chief villains of the 2008 meltdown (from which 90% of Americans have not recovered) have not been prosecuted is itself a form of corruption.
4. The US military budget is bloated and enormous, bigger than the military budgets of the next twelve major states. What isn’t usually realized is that perhaps half of it is spent on outsourced services, not on the military. It is corporate welfare on a cosmic scale. I’ve seen with my own eyes how officers in the military get out and then form companies to sell things to their former colleagues still on the inside.
5. The US has a vast gulag of 2.2 million prisoners in jails and penitentiaries. There is an increasing tendency for prisons to be privatized, and this tendency is corrupting the system. It is wrong for people to profit from putting and keeping human beings behind bars. This troubling trend is made all the more troubling by the move to give extra-long sentences for minor crimes, to deny parole, and to imprison people for life for, e.g., three small thefts.
6. The rich are well placed to bribe our politicians to reduce taxes on the rich. This and other government policies have produced a situation where 400 American billionaires are worth $2 trillion, as much as the bottom 150 million Americans. That kind of wealth inequality hasn’t been seen in the US since the age of the robber barons in the nineteenth century. Both eras are marked by extreme corruption.
7. The National Security Agency’s domestic spying is a form of corruption in itself, and lends itself to corruption. With some 4 million government employees and private contractors engaged in this surveillance, it is highly unlikely that various forms of insider trading and other corrupt practices are not being committed. If you knew who Warren Buffett and George Soros were calling every day, that alone could make you a killing. The American political class wouldn’t be defending this indefensible invasion of citizens’ privacy so vigorously if someone somewhere weren’t making money on it.
8. As for insider trading, it turns out Congress undid much of the law it hastily (and rather belatedly) passed forbidding members from engaging in insider trading (buying and selling stock based on their privileged knowledge of future government policy). That this practice only became an issue recently is another sign of how corrupt the system is.
9. Asset forfeiture in the ‘drug war’ is corrupting police departments and the judiciary.
10. Money and corruption have seeped so far into our media system that people can with a straight face assert that scientists aren’t sure human carbon emissions are causing global warming. Fox Cable News is among the more corrupt institutions in American society, purveying outright lies for the benefit of the billionaire class. The US is so corrupt that it is resisting the obvious urgency to slash carbon production. Even our relatively progressive president talks about exploiting all sources of energy, as though hydrocarbons were just as valuable as green energy and as though hydrocarbons weren’t poisoning the earth.
Even Qatar, its economy based on natural gas, freely admits the challenge of human-induced climate change. American politicians like Jim Inhofe are openly ridiculed when they travel to Europe for their know-nothingism on climate.
So don’t tell the Philippines or the other victims of American corruption how corrupt they are for taking a few petty bribes. Americans are not seen as corrupt because we only deal in the big denominations. Steal $2 trillion and you aren’t corrupt, you’re respectable.