Has anyone ever wondered why parents and teachers spend a bunch of time telling their kids:
"Don't talk to strangers."
"Don't take candy from strangers."
"If an adult touches you, tell your parents."
"If a strange adult is watching you, tell your parents."
Then every year when Christmas time rolls around, parents take their kids to the mall to see Santa:
A stranger that knows when you're sleeping and when you're awake.
Who asks you if you've been a bad girl/boy.
Who wants you to sit on his lap.
Who wants you to tell him your secret desires.
Who gives you candy.
AND who wants a picture of you on his lap.
This just seems strange to me... I mean, all year adults teach kids these fundamental rules of survival, and then Christmas comes along and parents (the very people children are supposed to tell if anything bad happens) hand their children over to this strange man and watch as he whispers and holds them and smiles...
Imagine how traumatizing this must be for the kids who really listened to their parents all year.
Here are some pics of a few kids who got the point and are asking themselves, "Why, mommy? Why are you making me do this?!"