I have the opportunity to take a position at a Christian store. I shop there on a regular basis, but as I have become aware of the erosion in the teachings of many churches, I feel very torn about accepting the job. I don't feel right about working in a store that calls itself Christian and yet promotes books like "The Shack" and sells "The Oprah Winfrey Bible."
Am I overthinking it? I understand that people need to make their own decisions about what they will read, but I also feel there is a tremendous responsibility on the retailer not to lead people astray. Since the store is a chain, these decisions are made by someone else, but does that really matter?
I would appreciate your feedback.