U.S. Department of Health and Human Services

The U.S. Department of Health and Human Services works to enhance and protect the health and well-being of all Americans by providing for effective health and human services and by fostering advances in medicine, public health, and social services.