I was recently watching a movie about the American Civil War (it wasn't very good, though). Anyway, it got me interested in the idea of states' rights in America. I know that in the beginning the American states were very independent, and many of the problems at the founding of the nation had to do with the scope of the federal government. These days, however, the federal government seems far more powerful than it used to be. Of course, this trend has been mirrored in many other countries (ahem, the EU).
What are your thoughts on states' rights? Shouldn't state governments have more power (similar to the Swiss cantons, for example)? Or is centralization an irreversible trend?
In a larger context, is it better to unify power at a higher level, or to vest it in smaller, more distinct regions?