Dr. King understood that drawing a line in the sand against imperialism was necessary to bring about the radical changes he wanted to see in the United States.
American exceptionalism is a form of white exceptionalism that portrays the U.S. as a society rooted in democracy rather than slavery, in liberty rather than brutality and genocide.