"Allied health" is a term describing the broad range of health professionals who are not doctors, dentists, or nurses. Allied health professionals aim to prevent, diagnose, and treat a range of conditions and illnesses, and often work within …