Death marks the end of your life story. Culture and individual background influence how we view death. In some cultures, death is accepted as a natural part of life and embraced. In contrast, until about 50 years ago in the United States, doctors often did not inform patients that they were dying, and the majority of deaths occurred in hospitals. In 1967 that reality began to change.