From what I’ve seen, the Korean War basically isn’t taught in American public schools. America was propping up military dictators in the South; it had no need to go to war with the South because it had essentially received control of it from Japan in the aftermath of WWII.