Would the college help me look for a job in my major, or at least point me toward opportunities to gain experience and insight into my future career?
I want a career in dental hygiene or physical therapy, and I would like to see what the job is like from my own perspective, not someone else's. I would want the college I attend to show me opportunities in the field where I can earn some experience and maybe a little training.
Many colleges and universities have career centers that can help you find employment or internships in the field you're pursuing.