When many people think of health care, the most familiar fields are what come to mind: general practitioners, dentists, optometrists, and specialty providers such as OB-GYNs. When it comes to overall health and wellness, however, taking care of your skin is just as important as taking care of the rest of your body. General dermatology is a service focused on helping you care for your body's largest organ – your skin!