RACISM IN THE UNITED STATES
Racism in the United States comprises interrelated negative attitudes and views on race or ethnicity, held by various people and groups in the United States, which have manifested themselves in discriminatory laws, practices, and actions (including violence) against racial or ethnic groups at various times in the country's history. Throughout American history, white Americans have generally benefited from legally or socially sanctioned privileges and rights that were denied to members of various ethnic or minority groups. European Americans, especially affluent white Anglo-Saxon Protestants, enjoyed advantages in matters of education, immigration, voting rights, citizenship, land acquisition, and criminal procedure.

Racism against various ethnic or minority groups has existed in the United States since the colonial period. Throughout much of American history, African Americans in particular have faced restrictions on their political, social, and economic freedoms.