Is it really necessary?
I grew up with members of my family selling life insurance policies on the side. I knew there was something to be gained from it, but I was never convinced that it was absolutely necessary.
As an adult, I have tried jumping on the "responsible" bandwagon and have looked at an insurance quote here and there. I even bought an insurance package that came with a mutual fund component.
Today, certain events got me thinking about life insurance yet again, and I still have no answer. Is life insurance something every person should get, or is it just a "plus factor" that can make things easier down the road?