While in college, is getting an internship the best thing to do?

Getting a secure job after college is the thing that most people worry about. This is a question many people have because college is supposed to help people with their careers, but in many cases people get out of school, can't find work in their field, and end up in low-end jobs even though they have college degrees. #sports-management #career

2 answers


Ashley’s Answer

Hi Ayanna,


I agree with Tushya and wanted to add that internships are also a possible entry point into a company or industry. Not only do they give you great experience, but they're an opportunity to show your skill and drive and to position yourself for a full-time employment offer. Be sure to weigh the benefit of experience and exposure within your chosen field when comparing paid versus unpaid internships.


Internships are also a great way to focus your goals and/or determine whether a chosen field is truly what you want to major in. They allow you to explore specialties and sub-groups within your field and help you map out your early career.


Good luck!

Tushya’s Answer

Hey,

In my personal opinion, I would suggest getting an internship during college. I did three internships myself during my college years. I feel it helps because it exposes you to a working environment in a way that college doesn't. Your mindset opens up in terms of attitude, performance, discipline, and many other factors. Another benefit is that after interning, you have experience on which to base a decision about whether you prefer a working environment or an academically oriented one, which can then help you choose between a corporate path and a research path.

I hope this answer helps you.