When and why did the mainstream media, the so-called journalists and news agencies, stop caring about the truth?  When and why did most of them stop asking the hard questions?  It seems the majority now push an agenda or protect special interests rather than report the real news.  Americans want the FACTS, and America needs them in order to thrive.

When did tolerating lies become acceptable?  It’s not, and it’s time we Americans start demanding unbiased news from those who act like they’re reporting it.  We mustn’t let the important, tough questions go unanswered.  America needs honest, fearless reporters and media outlets who will consistently fight for what’s right.  Otherwise, what’s the point?  It’s poor entertainment.  What’s happening now is a facade of journalism, and it’s bad for America.

“Accountability Experts” NEEDED = A New Media that asks ALL the questions necessary to report important, straightforward news.  The first order of business will be an internal investigation into WHY our current media doesn’t care about the truth.  The second order of business will be uncovering political corruption.

Note:  There are more Americans than politicians, which is hopeful, but the “media” controls what’s disseminated through the airwaves, which is kinda sketchy.