2 answers · 500 views

When working in healthcare, is it better to get your degree first, or work in an entry level job?

#GivingisCaring



Ashley’s Answer

I would always suggest getting your degree first, no matter what line of work you're going into. As you get older and "real life" sets in, it becomes harder to go back to college, and these days a Bachelor's degree is the bare minimum for most jobs. You might also be able to do both: start in an entry-level position and go to school at the same time.

Julio’s Answer

Hey Andrea!

Like Ashley said, a degree is the bare minimum for most jobs, especially in healthcare. One thing to keep in mind is which specific area of healthcare you want to work in, whether that's nursing, management, administration, etc. You will need a different type of degree for each career path, unlike a field such as business, where a standard Bachelor's in business can lead to work in marketing, management, and more.

Think about the specific area you want to work in, and research schools and programs that will get you certified for that career. There's no harm in working, interning, or volunteering at hospitals while you get your degree; in fact, it's preferred and will look very good on your resume!

Hope this helped, and good luck!