Doctors typically work in clinics, hospitals, and other health facilities. Our job mainly involves treating patients and managing disease in the community. We are trained to explain to a patient the nature of a disease, how it can be treated, and how it can be prevented. Our...