Yes, the Bible should be taught in our schools, because understanding the Bible is necessary if we are to truly understand our own culture and how it came to be. The Bible has influenced every part of Western culture, from our art, music, and history to our ideas of fairness, charity, and commerce.