Have you ever taken a selfie? If you're unfamiliar with the term, a selfie is a solo photo you snap of yourself. Selfies have drawn a lot of criticism lately for encouraging vanity. Check out this video to see why that criticism is overblown and how selfies can be feminist, body positive, and sex positive. The video also includes nine tips for using selfies to improve your body image.
Via Deanna Dahlsad