"Americans are notorious for their ignorance of global issues and international news. This may be because Americans aren’t interested or it may be that our news outlets feed us fluff and focus us only on the U.S."
Are the media only serving consumers 'what they want'? Do the media have a responsibility to educate the populace and give us 'what we need'? Socially speaking, what is it about American culture that keeps us looking in the mirror rather than out the window?