What is workism? It is the belief that work is not only necessary to economic production, but also the centerpiece of one’s identity and life’s purpose; and the belief that any policy to promote human welfare must always encourage more work.
Workism Is Making Americans Miserable
See also: On Burnout.