Oh, no, not in most jobs. Many jobs do provide some health care, but by no means all.
At a lot of US jobs you get no health care at all; you have to pay for it yourself. Even at jobs that do provide it, you still pay part of the cost: the employer basically covers a portion of the insurance bill. Good employers pay a lot of it, bad employers pay none.
Then you have deductibles: the amount you have to pay out of pocket every year before insurance pays anything. If you have a $10,000 deductible, insurance only kicks in at $10,001 and beyond.
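To make the deductible math concrete, here's a minimal sketch in Python. It assumes a simplified plan where you pay everything up to the deductible and the insurer covers everything past it; real plans also involve premiums, copays, coinsurance, and out-of-pocket maximums, and the numbers below are made up for illustration.

```python
def split_costs(total_bills: float, deductible: float) -> tuple[float, float]:
    """Split a year's medical bills into what you pay and what insurance pays.

    Simplified model: you pay 100% up to the deductible, the insurer pays
    100% of anything beyond it. Real plans add coinsurance, copays, and
    out-of-pocket maximums on top of this.
    """
    you_pay = min(total_bills, deductible)
    insurance_pays = max(total_bills - deductible, 0.0)
    return you_pay, insurance_pays

# With a $10,000 deductible (illustrative numbers only):
#   $8,000 in bills  -> you pay $8,000,  insurance pays $0
#   $25,000 in bills -> you pay $10,000, insurance pays $15,000
for bills in (8_000, 25_000):
    you, ins = split_costs(bills, deductible=10_000)
    print(f"bills ${bills:,}: you pay ${you:,.0f}, insurance pays ${ins:,.0f}")
```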