
Who created the nursing career in the United States?

I am interested in knowing where nursing was born in the United States. What is the starting point?

4 answers



Joshua’s Answer

There is really no clear answer, since aspects of nursing have been around for a long time and modern nursing was founded in England and brought to the US from there. In school we heard a lot about important figures like Clara Barton, Dorothea Dix, and Lillian Wald, who made the nursing profession what it is today in the US. Florence Nightingale is probably the best starting point for the history of modern nursing and modern nursing schools.

Kathy’s Answer

Hi, Saira~
As you can see from the responses above, there have always been men and women who stepped up to care for family and community members who fell ill, long before the name "nurse" was given to them. As early as the late 1400s, Spanish explorers of the "new world" brought monks with them to care for the ill, and the people who already lived here had shamans and healers.
So many nurses have been true pioneers in our profession and heroes for the people under their care. Here's a link to a wonderful book that records many of their (forgotten) stories:
https://www.amazon.com/Nursing-Illuminations-Patricia-van-Betten/dp/0323025846/ref=sr_1_2?crid=1TUQN0RV9Y4OX&keywords=melisa+moriarty&qid=1654738979&sprefix=melisa+moriarty%2Caps%2C198&sr=8-2
Best wishes to you as you continue on your path. We all stand on the shoulders of those who came before us. Congratulations to you for wanting to know who these truly inspiring women and men were.
Kathy

Hemant’s Answer

It's a noble profession. Nursing began the day disease entered our lives. Doctors cannot cure you alone, and family members rarely have the time to stay with a patient and look after them. A doctor writes a prescription and gives advice, but your loved ones will not always be available for you. This is the role of the nurse, who does not do this just for money; nurses save lives and try to bring happiness back into people's lives.
The World Wars, in which the US took part, are an example: thousands died and millions were injured. Without nurses, the figures would have been far worse.
Your question is how it became a career in the US, and my answer is that it has been one for ages, since before the US was even the US. Nursing had its roots here when the country was still a British colony.
Doctors may be seen as gods, but nurses save lives. If doctors are gods, then nurses are angels.
I share my daily life with a nurse, and seeing her everyday routine, I am proud of what she does.

Mary Beth’s Answer

All you need to do is an internet search; there is a plethora of information on nursing's history in the US. The first official nursing programs began around the 1870s and were always associated with local hospitals. Soon many other schools sprang up around the country. The rest, as they say, is history.

https://www.nursing.upenn.edu/nhhc/american-nursing-an-introduction-to-the-past/