cawacko
Trade agreements, while typically touted as good for the economy and therefore employment, have had the effect of pitting American workers against competitors in low-wage countries, thereby driving wages down in the U.S.
http://www.cbsnews.com/news/report-new-trade-pact-will-hurt-us-wages/
So you essentially believe everything sold in the U.S. should be made in America. Of course that will bring higher costs to U.S. consumers, but I'm assuming you consider that an acceptable trade-off?
What will happen to U.S. businesses and jobs when we stop exporting things? If we aren't going to trade with other countries, they aren't going to buy from us. Is that going to help our economy and country grow?