Once I have a degree, will my employers care what my degree is in?
#after college
2 answers
Rachel’s Answer
Hi Anna,
Your degree will likely matter most right after college, because you'll be using it to start your professional career. As you gain experience in a field and move into other roles, your job experience will carry more weight than your degree title. That said, in many fields your degree will still count for at least something when employers consider you for a role.
Best of luck to you!
Angela’s Answer
Hi Anna,
It depends on your career path. Some occupations require a particular degree, while others only have recommended degrees. Either way, experience is very important. If your degree isn't in the subject matter, make sure you get enough real-world experience to talk about during interviews.
Good luck!