TV series have always had politics injected into them, but back in the day it played a minor role in the show. You could easily watch and not think anything of it. These days, it seems most shows NEED to be spreading a message of social politics, and it has put me off big time. I hardly watch anything anymore. Do you feel TV shows these days are too socially political?