Can an employer make you sign a contract?
An employer is making its employees sign a contract that states,
Asked on February 9, 2018 under Employment Labor Law, Tennessee
Answers:
M.D., Member, California and New York Bar / FreeAdvice Contributing Attorney
Answered 6 years ago | Contributor
Yes, your employer can require you to do this. Absent some form of actionable discrimination or the breach of an employment contract or union agreement, an employer can set the conditions of employment largely as it sees fit. This includes requiring workers to sign a contract such as the one you describe. If the contract is unacceptable to you, your options are to refuse to sign (and risk termination) or to quit.
IMPORTANT NOTICE: The Answer(s) provided above are for general information only. The attorney providing the answer was not serving as the attorney for the person submitting the question, nor in any attorney-client relationship with that person. Laws vary from state to state and sometimes change. Small variations in the facts, or a fact not set forth in a question, can often change a legal outcome or an attorney's conclusion. Although AttorneyPages.com has verified that the attorney was admitted to practice law in at least one jurisdiction, he or she may not be authorized to practice law in the jurisdiction referred to in the question, nor is he or she necessarily experienced in the area of law involved. You should not rely on the Answer(s) above as personal legal advice; for advice you can rely upon, we suggest you retain an attorney to represent you.