I was talking with a good friend the other day, sounding off about people pulling the race card (particularly in regard to Barry Bonds). She told me that she believes that, while not all white people are racist, white society as a whole is inherently racist. She also told me that if any minority dislikes a white person, that does not constitute racism. I think it's bull****, but I wanted to hear other opinions.

So, is white society inherently racist? Yes or no? Toward whom? And why do you think that? Is there such a thing as black-on-white racism, or are white people just stupid? Go.