At the outbreak of World War One, the United States of America decided not to involve itself in what it saw as a European conflict. However, some major events brought an end to this neutral position.
It was on April 6, 1917, that the United States abandoned its stance of neutrality and officially joined the Allied powers ...