So I sit down, turn on the TV, and it's on Animal Planet. It's on some new show or something about people who catch and kill rattlesnakes for their skin. Ignoring the fact that it is wrong to hunt wild animals for anything that harms them (death in this case), it kills me to know a place like Animal Planet sponsors animal abuse for a profit. Maybe I'm being a tad overdramatic, but when I used to think of Animal Planet, I thought of a company that sponsored things like animal rescues and animal rights, not just some sellout trying to make a profit.
What is the world coming to...
Btw, just to state: it is different when an animal is "farmed" and killed for money, because that's what it was born for (as wrong as that may sound).