I haven't thought about this in a while, but I was watching something on TV the other day with flashbacks from the '60s showing white people carrying signs that said "Back to Africa." For the life of me, I've NEVER understood that whole "Back to Africa" campaign, if you will. My whole thing is, aren't white people the ones who brought Africans to America in the first place? So how are you gonna be mad about us being here if it's your fault we're here? The Africans were minding their own business when Mr. Slave Master decided to show up and rip them out of their homes. Did they ask to come here? Nope. Not even a little bit.
Then, generations later, we have Mr. Slave Master's descendants telling Black people to go back to Africa. I'm pretty sure my entire family of Black folks was born right here in America, as was I, so how are we gonna go back to a place we've never been? If you have a problem with me being here, blame your ancestors. It's like, us being here was okay for all those years you were getting over on us and forcing us into free labor, but once we got our freedom, our presence was suddenly a problem. Oh well, not our fault, so get over it.
Anyway, that's all I really have to say on the subject. It's just something that has always bothered me, because if you really think about it, it makes no sense.
That's all for now....