Is war inevitable to humankind forever?
Is there a historical "necessity" to take up arms and kill each other?
Okay, let me explain my questions before you stigmatize me. I'm neither one of those (insert political ideology here) ever-complimenting pacifists who oppose all wars, nor one of those raging (insert political ideology here) hawks who demand bloodshed for every possible cause. (And I'm not naive; I'm just interested in YOUR answers, TW fans.)
Studying history, one could assume that war is inevitable. War is born in the mind, and as long as there is even a slight notion of violence in our minds, there will be war.
But is war really hard-wired into our psyche? Can't we get rid of it, like we got rid of slavery? (BTW, did we truly get rid of slavery?)
Why do we have to fight when there are other possible solutions to our problems?
edit: August 14, attached a poll to the thread. Vote and comment. (The "My Rifle" quote is from the Rifleman's Creed.)