I've been seeing a growing trend of Americans who are unhappy with how things work in America right now.

From low wages and nonexistent benefits to a lack of worker protections and support, especially compared to European countries.

What are your thoughts on American jobs at this point? What about job perks and government benefits?
Keep in mind this is about the majority of Americans, not just your own role and status.