By Kalena Thomhave | Feb 12, 2018
In a bid to combat drowsy driving, Uber recently announced a new policy capping drivers at 12 hours of driving time. After 12 hours, the app will go offline, and drivers must take at least a six-hour break.
While the effort to encourage safer driving is laudable, one must ask: Uber drivers sometimes work nonstop for 12 straight hours? That doesn’t sound like a “side hustle,” which is how Uber markets the job.
But so many Uber drivers work until they’re exhausted that the company decided to force them off the road, instead of paying them more to work fewer hours.
Indeed, it shouldn’t be surprising that some Uber drivers find themselves nodding off after a long shift. While many drivers work for Uber to supplement their regular pay, others drive for Uber full-time. Uber drivers do not have workplace protections like a minimum wage—and that encourages workers to push themselves to drive for long hours to pay their bills. After all, “setting your own schedule” is a major incentive to drive for Uber.
Uber drivers also get the privilege of setting their own benefits, since the company doesn’t provide them with any. In this sector of the “gig economy,” drivers don’t get benefits like health insurance or retirement accounts, so if drivers want these things, they have to pay for them.
Yes, we should keep sleepy Uber drivers off the road. One way to do that could be to pay them more.
By Kalena Thomhave | Feb 06, 2018
A proposed Department of Labor rule would allow employers to pocket their employees’ tips. The proposed rule in no way requires that these pocketed tips be distributed among employees—employers could simply keep them (a fact the DOL tried hard to cover up). The Economic Policy Institute estimates that the rule would cost workers $5.8 billion in tips per year. Outrage at this prospect is warranted, but it’s also worth revisiting why tipping exists in the United States in the first place.
In the late 1800s, wealthy Americans brought home from Europe the aristocratic practice of tipping, a way to flaunt one’s status by handing spare change to those considered one’s inferiors. And many employers were delighted to hire formerly enslaved African Americans and pay them nothing at all, forcing them to rely solely on tips.
Yet Americans were angered by tipping, claiming that it was anti-democratic and would only contribute to classism. A union-led movement against tipping in the early 1900s saw six states ban tipping altogether.
But as we know, that movement fell apart in the United States (though not in Europe), and tipping is now an ingrained standard in American society. And just as its racist and classist history would predict: Black workers receive less in tips than their white counterparts, sexism plays a role in who receives the highest tips, and nearly one-fifth of tipped workers in states that adhere to the federal tipped minimum wage live in poverty.
Calls for a higher minimum wage don’t often include the tipped wage, which has stubbornly remained at $2.13 an hour since 1991. Sure, restaurants are required to ensure that tipped employees receive at least the federal minimum wage, but that doesn’t always happen. And sure, many employees prefer receiving tips because there’s a chance they could make many times more than the minimum wage—but that is by no means typical: The median hourly wage for servers was $9.61 in 2016.
Once a practice becomes the norm, it’s easy to forget the discriminatory history and oppressive institutions that set it in motion in the first place. The DOL’s proposed tip-stealing rule could add yet another chapter to tipping’s long, unjust history.
By Kalena Thomhave | Feb 02, 2018
The new secretary of Health and Human Services, Alex Azar, will announce today that Indiana will follow Kentucky’s lead and receive approval to implement work requirements in the state’s Medicaid program, according to a Politico report. Last month, the Trump administration signaled that it would allow states to require work from low-income people seeking health-care assistance, and Kentucky quickly became the first state to receive the green light to radically change its Medicaid model.
Indiana actually inspired some parts of Kentucky’s plan, as the state has included aspects of “consumer-driven” health insurance, like premium payments, in its Medicaid program since 2015. Data show that 25,000 people were dropped from Indiana’s Medicaid rolls between 2015 and 2017 for failing to pay their premiums.
Now, like Kentucky, Indiana will be adding work requirements to the mix.
But not so fast. Three organizations have brought a lawsuit on behalf of 15 Kentucky Medicaid recipients, alleging that forcing Medicaid recipients to work to continue receiving health care is a violation of federal law. The Kentucky Equal Justice Center, the Southern Poverty Law Center, and the National Health Law Program argue that the Trump administration’s willingness to allow work requirements and its approval of Kentucky’s plan to restructure Medicaid “are unauthorized attempts to re-write the Medicaid Act.” As I reported in September, nearly 100,000 Kentuckians are expected to lose Medicaid as a result of the changes to the state’s program.
The Obama administration resisted allowing work requirements in Medicaid, reasoning that such requirements, which would reduce coverage, were inconsistent with the purpose of the program: to provide health care to low-income people.
As more states add a work requirement to Medicaid receipt, the benefits of Medicaid expansion (more people accessing preventive care and getting healthier) will begin to erode. While states with conservative governors may be willing to expand Medicaid if it means they can require poor people to work, this would be a Pyrrhic victory for the left: Work requirements mean that the neediest people won’t receive care, and they reinforce the harmful assumption that assistance should go only to those deemed most “deserving.”
By Kalena Thomhave | Feb 01, 2018
Under pressure from Major League Baseball, the Cleveland Indians announced this week that beginning in 2019, they’ll retire the Chief Wahoo mascot—the cartoonish, red-faced figure that’s meant to depict a Native American chief—but only from on-field team uniforms.
“We have consistently maintained that we are cognizant and sensitive to both sides of the discussion,” said Paul Dolan, the owner of the team. And in fact, they are trying to please “both sides” by retiring Wahoo on the field, but not from merchandise sold by the Indians organization, allowing it to keep profiting from the logo.
Opponents argue that these depictions “honor” Native Americans, but studies have shown that stereotype-based mascots and related imagery in sports have real, damaging psychological and social consequences for Native Americans—and they especially impact the development and self-esteem of Native youth.
In a statement, MLB, which will no longer be selling Wahoo apparel in its official shop, said the mascot “was no longer appropriate.” Was it ever? Native Americans have been calling for the removal of Wahoo for decades, most recently with the #NotYourMascot campaign. And while this move is a step in the right direction, activists were quick to point out that the team name itself needs changing, too. (There’s a movement in Cleveland to change the name to the Spiders, the name of the city’s baseball team in the late 1800s.)
There’s a certain football team that I’ll only call “the Washington team” that might want to revisit its branding next.