Self-care has become a buzzword for advertisers trying to sell weekend vacations, spa packages, and gym memberships, but what does the term really mean? In truth, self-care simply means purposely providing for your own health and well-being “through restorative activities,” because failing to do so can have dire consequences, including financial ones.