It seems like everywhere I go, whenever the topic of animals comes up, everyone says things like "Animals are unimportant" or "Humans are more valuable than animals, who have no value." I've also noticed a rise in animal cruelty and hunting. What's more, there seem to be more and more games, TV shows, etc. with animals and animal lovers as the bad guys, and to make matters worse, animal-hating people as the good guys. Even Animal Planet these days rarely shows how fascinating animals are, focusing instead on how horrible they are. Have animals really become this worthless?