Sketches: The LRG Blog

A Short Reflection on the Evidentiality of #Evidence

Evidence sometimes suffers from a peculiar problem.  It is not always evident.  I call this the problem of the evidentiality of evidence.

By definition, evidence is that which in its appearance facilitates the same for something else.

There is at times circularity with such efforts — for instance, proof.  What is proof, however, but offering evidence?

Sometimes evidence doesn’t appear even though it stands before us.  Its appearance, in other words, requires understanding it as what it is.  A pair of shoes in front of a door of a house where people take off their shoes when entering doesn’t mean anything among a set of other shoes.  A missing pair in the pile informs members of the family of which one of them is not home.  A visitor, however, would notice nothing more than the etiquette of taking off one’s shoes when entering that family’s home.  Evidence, in other words, must be understood, and understanding it leads to its simultaneous appearance and realization of the absent thing, person, or event to which it points.  In phenomenological terms, one must be conscious of it.

An objection could be made that consciousness is not a necessary condition for evidence.  That which is evident is simply out there, in the world, waiting to be discovered, and is still there even if never found.  Conceding this doesn’t change the point, however.  Evidence never found is simply absent evidence.

The details cannot be spelled out in the small space afforded by this forum, but consciousness as articulated here is about a relationship with an object, which could also be formal or intended but not necessarily psychological.  An object of investigation here is evidence and the challenges of establishing a relationship with it.

Evidentiality involves the fundamental relationality of evidence.  Imagine a non-relational view of the world.  That would require ignoring conditions by which a phenomenon could appear or “be” in the first place, since one would not be related or connected to anything, including the most basic relation of identity — that is, related to oneself.  In philosophy, such a collapse is called solipsism — making oneself into all there is or, simply, the world.  The contradiction from eliminating all relations, including to the self, leads to the disappearance of any basis from which to make distinctions, think thoughts, and do anything.  It leads proverbially nowhere.

Appresentation refers to what we perceive without its being visible.  For example, we are aware of the back of someone’s head while speaking to her face-to-face.  Or more intimately, our organs are appresented to us daily.  Evidence, in other words, brings to consciousness or any field of disclosure what must be, which requires connecting a series of missing phenomena, in effect, an ordering, or, in old-style philosophical language, logos (whose origins are from logging), which also points, inevitably, to a point of reference beyond the self.  Evidence, thus, requires intersubjectivity, a world of others, even with regard to the self — that is, the self taking on the perspective of another and also acknowledging its capacity to be another — and is therefore symbiotically linked to social reality.

Evidence is peculiarly social.  By social, I simply mean it is at its core that which must be communicable to others.  Where one has difficulty communicating evidence to others, one must ask how it was initially communicated to oneself.  Where one continues to see evidence as evidence, there is reputed communication of meaning.  This basic point extends not only to communicating with others in a shared language but also to others across languages, where two possibilities emerge.  The first is the translation of evidence.  This requires additional acts of evidentiality such as determining isomorphic and shared terms in different languages.  The second is the communicability of the untranslatable.  Here that which must be understood to appear as evidence must be learned.  Again, for the sake of brevity, the basic fact that people from different societies speaking different languages do manage to learn untranslatable terms and expressions from other societies, as Kwasi Wiredu showed in his classic Cultural Universals and Particulars, is sufficient for the basic provision of communicated evidence.

My initial forays into what could be called evidence studies began in my research on mauvaise foi or bad faith, the phenomenon of lying to the self, in my book Bad Faith and Antiblack Racism.  A lie told to others is one thing.  It involves not only false statements but also the withholding of what may alert others about the speaker’s not telling the truth.  To some extent, a successful liar must not in her or his disposition slip up.  The liar must at least appear “sincere.”  This means to some extent convincing oneself of the lie as truth while telling it.  Many liars thus not only withhold evidence from others but also from themselves.  In effect, they must identify the evidence to be withheld and then refuse its evidentiality (appearance).  Such liars must then disarm the evidential force of evidence.  Understanding this brings forth the philosophical problem of bad faith, since it involves the liar and the lied-to being the same person or, as Jean-Paul Sartre formulated it, the unity of a single consciousness.

Critics may be perplexed to discover that good faith is a form of bad faith.  The observation about sincerity reveals the problem.  One could sincerely be in bad faith.  To be critical of that sincerity, however, requires bringing in an account of one’s position beyond oneself as a source of legitimacy.  In effect, one reconnects with a world that had to be put at bay in order to maintain such sincerity.  As bad faith is a flight from social reality, a return brings along with it the variety of public resources it offers.  One of them is evidence.

Before I continue, I would like to stress one thing.  In this context, bad faith is not a moral judgment.  It’s simply a description of a capability, of something people often do.  At times, it is so for good reason.  In moments of trauma, for instance, one may wish to avoid displeasing truths through taking refuge in pleasing falsehoods.  That said, let us return to its relationship to evidence.

Bad faith attempts to disarm evidence through appeals, at times, to non-persuasive evidence.

If it appears, then something else is already known or is evident.  Its sufficiency and necessity are one.  As there isn’t room to elaborate bad faith here, I’ll just close with a recent context in which the evidentiality of evidence is crucial.

Evidence often becomes problematic in the human sciences.  A particular field in which this takes place is research on race and racism.  Although I’ll focus on race and racism studies here, my preferred approach is multifaceted, where the embodiment of class, gender, race, sex, sexuality, and more are brought together and interrogated through human study.  As I often put it, I never see a race or gender or class or sex walking but instead a manifestation or functioning of all, in which one is more emphasized or functions more strongly than the others at different moments, though all are always present.

The historical social pressures to avoid addressing race and racism took form also in their study.  Thus practitioners of the human sciences often attempted to avoid the taint of race and racism to the point of contradictorily delegitimating or denouncing the study of such phenomena while studying them.  The performative contradiction notwithstanding, other consequences include the confusion of problems faced by subjects of such study with such people being problems.  In the first instance, their problems appear where racism appears.  If, however, racism is denied but the problems appear, the trail of causes stops at the people themselves.  The problems and the people become one, and, as Franz Boas, Anna Julia Cooper, W.E.B. Du Bois, Frantz Fanon, and many others have shown, they become problems supposedly alleviated ultimately by the elimination of such people.  Race and racism therefore raise problems of rigor in the human sciences, as ultimately racial subjects are human ones.

In Disciplinary Decadence, I argue that bad faith disciplinary practices involve the fetishizing of method, where practitioners presume the completeness of their discipline and its methodological resources.  Treated as if created by a god, such methods need simply be applied with assurance of their outcomes.

Such practitioners reject what is offered from other disciplines basically on the grounds of its not emerging from or being their own.  Natural scientists who criticize social scientists for not being natural scientists are an example, and, as social constructivism’s reach beyond its scope attests, there are social scientists who reject natural scientists for not being social scientists.  These rejections are specific at disciplinary levels as well: biologists who reject cultural anthropology, historians who reject psychologists, literary scholars who reject other disciplines for not being “textual,” philosophers who reject nearly everyone else’s participation in theory, and the list goes on.  Lost, however, is how such methods initially emerged.  Their suitability for a fragment of reality (a specific subject of inquiry) is not necessarily so for larger portions.  Refusal to admit this leads to the effort to squeeze reality into the discipline instead of adjusting the discipline to reality.  Turned inward as complete, the discipline collapses into a form of solipsism.  The portrait I offered of evading evidentiality returns.

I’ve argued that overcoming disciplinary decadence requires a teleological suspension of disciplinarity.  This is where a discipline is willing to go beyond its presuppositions for the sake of maintaining or re-establishing a relationship with reality.  I use the term “teleological suspension” in light of the Danish philosopher Kierkegaard’s teleological suspension of the ethical.  The irony of strict adherence to morality is that it has sometimes led to unethical behavior.  Women and men of right could sometimes be very cruel.  It’s a good idea to reacquaint ourselves with why we do what we do.  This is an existential paradox brought to the level of disciplinary practice.  A practitioner at times must be willing to go beyond her discipline for the sake of reality, which may reinvigorate disciplinary integrity.  That leap of faith, so to speak, is often ironic, since instead of abandoning a discipline, it sometimes offers an expanded portrait of it.  One version of such is the communicability of a discipline and the very pragmatic outcome of using resources from other disciplines better suited for a particular problem at hand.  Anténor Firmin argued such in The Equality of the Human Races (1885) when he pointed out the problem of anthropology as this: there are so many elements manifested in what we call human beings that such study requires not only multiple disciplines but also their working together, communicating, to make evident what is often overlooked — namely, the unfolding of meaning as lived by a being of projects in variation.

Returning to the evidentiality of evidence, bad faith in the human sciences disarms the critical norms of evidence.  Criticality and evidentiality are intimately related.  The former has etymological origins in the ancient Greek verb krinein (to decide), from which emerged not only the nouns kritēs (judge) and kritērion (means or standard of judgment) but also krisis (crisis).  The link with evidence, whose Latin root evidens means “obvious” or “apparent,” should be clear (that is, evident): good judgment involves making a decision based upon standards (criteria) whose appearance is compelling.  That the etymological origin of “critical” is shared with “crisis” is significant, as a critical situation is one over which a decision must be made, and a crisis is one in which the choice faced is often one that one would prefer to avoid or defer.  The classic choice not to choose is a performative contradiction rich in bad faith.  It requires, as Kierkegaard once formulated, failing to see what one sees.  With regard to the evidentiality of evidence, this means addressing the metacritical relation to evidence — namely, the admission of evidence as evidence.  The etymological thread boils down to the appearance of appearance.

At this point, reasoning demands exploration of the various fallacies often brought in the service of occluding evidence.  Such elaboration is beyond the scope of this forum.  For our purposes, however, the basic point should be obvious (evident), that a challenge posed by evidence is our willingness to respect it.  Human agency at the heart of our relationship to evidence ultimately comes down to the amount of reality many of us are willing to take.  This is not in and of itself pernicious, since, as finite beings, most of us could only accept reality in small doses.  All at once is overwhelming, and as no one can be everywhere, everyone must rely on what, by virtue of its presence, alerts us to what is absent.

Works Cited

Gordon, Lewis R. 1995. Bad Faith and Antiblack Racism. Amherst, NY: Humanity Books.

_____. 2006. Disciplinary Decadence: Living Thought in Trying Times. New York: Routledge.

_____. 2008. An Introduction to Africana Philosophy. Cambridge, UK: Cambridge University Press.

_____. 2016. “Disciplining as a Human Science,” Quaderna: A Multilingual and Transdisciplinary Journal, no. 3.

Husserl, Edmund. 1969. Formal and Transcendental Logic, trans. Dorion Cairns. The Hague: Martinus Nijhoff.

Kierkegaard, Søren. 1983. “Fear and Trembling” and “Repetition,” ed. and trans. with introduction and notes by Howard V. Hong and Edna H. Hong. Princeton, NJ: Princeton University Press.

_____. 1998. Works of Love, trans. Howard V. Hong and Edna H. Hong. Princeton, NJ: Princeton University Press.

Sartre, Jean-Paul. 1956. Being and Nothingness: A Phenomenological Essay on Ontology, trans. Hazel V. Barnes. New York: Washington Square Press.

Schutz, Alfred. 1962. Collected Papers, vol. 1, The Problem of Social Reality. Ed. with an intro. by Maurice Natanson and a preface by H. L. Van Breda. The Hague: Martinus Nijhoff.

_____. 1970. The Phenomenology of the Social World. Translated by George Walsh and Frederick Lehnhert, with an introduction by George Walsh. Evanston: Northwestern University Press.

Wiredu, Kwasi. 1996. Cultural Universals and Particulars: An African Perspective. Bloomington: Indiana University Press.

Continues to Rise: Muhammad Ali (1942-2016)


Muhammad Ali’s life could be summed up in a single statement: freedom is always worth fighting for. As a professional pugilist, he inspired millions. As a political radical, he carried this conviction beyond the ring, fiercely denouncing racism and imperialism. But these two aspects of his life – the athlete and the militant – cannot be separated. His entire boxing career was fully political, and his greatest matches, against Ernie Terrell and George Foreman, saw him waging the struggle against white supremacy, racism, and collaborationism in the boxing ring itself.

Insights of a Warrior

His athletic achievements range from an Olympic gold medal in the light-heavyweight division at the Rome games of 1960 to becoming the world heavyweight champion three times, with a repertoire of some of the most amazing matches in boxing history. He was so fast, creative, and tactical that he even influenced the great Bruce Lee, his noteworthy peer in Asian martial arts, world fame, and political commitments. Lee gave Ali the most sincere form of flattery by adding the latter’s style of footwork to Jeet Kune Do, his approach to Gung Fu. Legendary boxer though he was, Ali will be remembered for the Promethean struggle he fought for dignity and respect not only as a man but also as one belonging to those despised by the country of his birth.

Ali fought, which means he also received his share of punches, despite floating like a butterfly and stinging like a bee (this signature phrase was actually penned by his Afro-Jewish assistant trainer and cornerman Drew Bundini Brown). He was one of a kind, though that didn’t mean there weren’t analogues of him in other sites of struggle for the liberation of those under the heels of white supremacy, capitalism, and imperialism. I have already mentioned Bruce Lee, who, as an Asian American, no doubt appreciated Ali’s courageous statements of solidarity with East Asians during the U.S. war against Vietnam. In the struggle against Jim Crow, Malcolm X, the friend whom, sadly, he later disavowed, stood for the same in word and deed in the realm of what Cornel West calls prophetic protest.

Yet, in terms of specific philosophical location and struggles in and beyond the ring, at least with regard to the basic question of standing up for what is right and the dignity it demands, his affinities were with the legendary, revolutionary philosopher-psychiatrist Frantz Fanon. Unlike Ali, however, Fanon’s encounter with the realities of France, his nemesis-home, was not through an Olympic trial but through the humiliation he suffered while fighting for France in World War II, from which he returned – like Ali, who wouldn’t be served in a diner in his hometown – as a twice-decorated hero with continued, questioned status as a human being. Fanon eventually left France, fought for Algerian independence, served as a representative of the struggle throughout southern Africa, and left a powerful set of writings, all marked by the insights of a warrior, challenging us to fight for a healthy humanity. Though not a health professional, Ali shared Fanon’s diagnosis of the situation: better to be angry fighting for freedom than to be a “happy” slave.

What’s in a Name?

Born in Louisville, Kentucky, in 1942, he was the son of a sign-maker. The symbolism is evident. A sign always points to something other than itself, and, true to form, Ali kept questioning the world in which he lived. He never accepted the standard response to black subordination, exemplified by his father’s pointing to his skin color as the source of the obstacles his son faced. Joining critical Black thought from over the ages, he in effect responded that he wasn’t the problem – it was those who imposed such limitations on him.

Barriers, the precocious lad understood, should be torn down. Like many freedom fighters before him, he resolved to do so in a path from initial literacy to fists of resistance and then to political speech. Politics, after all, is about power, a relationship in which racist societies demand nothing beyond silence from those they dominate. Frederick Douglass, for instance, fought for his freedom first through learning to read, then matching fists with the slave-breaker Reverend Covey before moving to the North and then engaging in abolition activism in which his powers of speech were legendary.

Ali, who in his youth was Cassius Marcellus Clay, Jr., took a similar path through amateur boxing, then the Olympics, and then professional boxing. His accolades early on included Golden Gloves titles. His determination throughout made it clear that something burned deep within him. He once remarked that he never started counting when doing sit-up exercises until after his abdomen began to hurt. Pain for him was a reminder of what he had to overcome. As I sometimes remind readers, it wasn’t liberation struggles that brought violence into Fanon’s life; as a colonial subject, he was born into violence. So was Ali, who was smart enough to understand that no physical blow matched those offered by the legal system, a double-standard society, and the constant violence of an ideology of continued degradation in print, the radio waves, cinema, and television. Those forces, even at the spiritual level, made their messages clear: the world was supposedly better without people like him, regardless of their achievement. He had a healthy response: there’s something wrong with that world, not the people it persecuted.

Changing that world meant for Ali a battle on inner as well as outer fronts. He already waged war on the outer, where he knocked down opponents of many kinds, including, to the chagrin of racist audiences, white ones. For the inner, he sought the counsel of the Nation of Islam, which led not only to his conversion but also his birth (for him, a form of being made whole by tearing asunder the effects of enslavement) as Muhammad Ali.

Interestingly enough, the “slave name” he discarded was in honor of Cassius Marcellus Clay (1810–1903), a white abolitionist who, among his many claims to fame, fought off assassins who had shot him point blank in the chest in one instance and a group that had stabbed him on another occasion. It was Clay, along with Frederick Douglass, who had insisted that President Lincoln issue a proclamation for the emancipation of enslaved people in the U.S. South. The reach of a sign is, we should remember, always beyond itself.

Everything about Muhammad Ali was poetic and thus symbolic. His movement from his disavowed slave name (despite its not being from an enslaver) to his anointed one (chosen by the Honorable Elijah Muhammad) is about transcending the soil: clay, after all, is an earthly permeable substance, and “Ali” is Arabic for high, or, as he correctly added, “most high.” “Muhammad” means “praiseworthy.” There is no doubt that Muhammad Ali’s life met the challenge of his name. I suspect as well that Clay would understand the importance of Ali’s choice: true freedom requires surpassing even those who fought for our emancipation.

Politics in the Ring

The question of Ali’s name occasioned what is no doubt his most remembered, symbolic fight. First, however, consider the proverbial lead up.

Ali was well known for his boasting and fiery rhetoric. What his critics didn’t realize is what many people of color who celebrated him across the world understood. The supposedly requisite need for white recognition is degrading. Ali refused to be patronized. Like Frantz Fanon and Malcolm X, whose words irritated and often frightened white audiences, Ali with his words challenged antiblack racists who by definition rejected the idea that any person of African descent deserved respect. Even worse, his public acknowledgment of his self-respect announced that his spirit was not crushed and that he refused to let it ever be. His naysayers didn’t understand that Ali’s use of the pronoun “I” was never really singular in its designation. He knew they rejected him in his individuality, which meant his declaration spread across a people. He was announcing during the Civil Rights Struggle that Blacks were fighting for their right to exist and to flourish. That he won the heavyweight championship against Sonny Liston in 1964, the year of the Civil Rights Act outlawing discrimination on the basis of race, color, religion, sex, or national origin, speaks for itself.

Ali’s jaunts and taunts were unforgiving, however, to those whom Malcolm X called “house Negroes” or “Uncle Toms.” Every racist society has some version of this figure. The French, for instance, have le Bon Nègre. Such figures were guided by a single creed: never, ever, upset whites. They no doubt represented for Ali the threat from within, which by extension applied not only to what he purged from his own soul but also what jeopardized liberation movements for all.

The World Boxing Association (WBA) had stripped Ali of his title when he joined the Nation of Islam (now The World Community of Al-Islam), which the Federal Bureau of Investigation had classified as a hate group and a threat to national security. The opening left Ernie Terrell as the WBA champion. The stage was set for Terrell to represent the House Negro who could please white masters by putting the upstart Ali in his supposed “place.” To make matters worse, the Louisville draft board reclassified Ali to make him eligible for the draft. His famous response, “I ain’t got nothing against no Viet Cong; no Viet Cong never called me nigger,” made him a hero among the downtrodden and those living in what was then called the Third World, in addition to critics of the war, and a more intense object of white hatred. As the fight approached, Terrell kept referring to Ali by his disavowed slave name of Cassius Clay. Bear in mind that these events unfolded during 1966, when Title IV, proposing non-discrimination in housing, was defeated in the U.S. Congress; the tides, in other words, were already turning against the gains from 1964. It was no small matter that his former friend, Malcolm X, was assassinated in 1965. State-sanctioned destruction of those who defied colonialism and racism was, as the expression goes, business as usual.

Ali and Terrell had their epic battle on February 6, 1967. It was a brutal, fifteen-round fight in which Ali, upon landing each punch, added, “What’s my name, Uncle Tom . . . what’s my name?” To perhaps the judges’, and most certainly the majority-white audience’s, chagrin, the decision of Ali’s victory was unanimous.

Ali and his name were victorious, but retaliation came in a pattern familiar from what had been unleashed on those such as W.E.B. Du Bois and Paul Robeson before him: he was stripped again of his titles, and his boxing license and passport were taken away. Unable to leave the country, he spent 1967 to 1970 appealing his conviction for draft evasion despite being a conscientious objector, while finding alternative means of earning an income. His license was reinstated in 1970 and his conviction overturned in 1971. His return to professional boxing led to some of the greatest showdowns, the most memorable of which, in athletic terms, were his loss and then victory against Joe Frazier. His last great, politically symbolic fight, however, was against George Foreman, against whom he used his famous “rope-a-dope” technique of absorbing punches until his opponent tired out.

Foreman was an Olympic gold medalist at the 1968 Mexico games in which Tommie Smith and John Carlos made their historic, raised, black-gloved fists of protest. Foreman countered their defiance by waving the U.S. flag at the moment of his victory. Though a much beloved celebrity today, what many people of color across the globe saw in 1968 was the return of the repulsive, subservient figure against whom liberationists such as Ali fought. Taking place in the then Republic of Zaire (now known as the Democratic Republic of the Congo), the fight against Foreman was the event in which Ali reclaimed his title as heavyweight champion through defeating an opponent whom audiences of color saw as complicit in the domination of his fellow oppressed peoples. The victory symbolized Africa, and indeed the then Third World, fighting back.

The need to reassert white dominance never abandoned American popular culture. The 1976 film Rocky effectively tapped into the white supremacist dream of the Great White Hope through pitting Rocky Balboa (based on the white boxer Chuck Wepner, who in 1975 almost went fifteen rounds against Ali before losing by a knockout) against the Ali-inspired Apollo Creed. It is no surprise that in cinema, where fantasy rules, white supremacy found solace. Reviewing Rocky II in 1979 in conversation with critic Roger Ebert, Ali said: “For the black man to come out superior would be against America’s teachings. I have been so great in boxing they had to create an image like Rocky, a white image on the screen, to counteract my image in the ring. America has to have its white images, no matter where it gets them. Jesus, Wonder Woman, Tarzan and Rocky.”

After regaining the heavyweight title in 1974, Ali, at age 32, was already getting old for his profession. Subsequent defeat and retirement a decade later were inevitable, and in terms of his body, the onset of Parkinson’s disease led to a tragic struggle, with signs of dignity characteristic of the man, for the rest of his life. His two greatest weapons against his subordination, his physical prowess and his gift of speech, were compromised. Ali, however, was never defeated. One could imagine how many thoughts, how many moments of reflexive muscular poise, reminded him of limitations that made him seem his own prisoner. Yet, Ali never lost sight of what was ultimately greater than himself. His faith (which led to his taking the Hajj to Mecca/Makkah in 1972), after all, taught him that being the greatest among men never meant being greater than The Most High, the Greatest of the Greatest. His commitment, then, meant asserting perhaps his greatest virtue – his humanity. One could imagine how, freed from his affliction, he would have spoken in solidarity with #BlackLivesMatter, against Islamophobia, and for global solidarity against the many forms of degradation besetting the world today.

Ali’s remains return to Louisville on June 10. Though his death returns him to the soil (yes, to clay), we all know in our hearts that we remember him, Ali, because he, as poet Maya Angelou would remind us, continues to rise.

New Translation

Russian translation by Maria Kaplun of “Types of Academics and Other Kinds of Intellectuals”: “Льюис Гордон. Типы ученых, гуманитариев и высшей интеллигенции,” ∑: Syg ma (4 February 2016).

On Star Wars: The Force Awakens or Daddy Issues Continued but...

This week’s blog entry is from my contribution to THE DAILY NOUS:


The danger with entertainment is our tendency to forget what it is: entertainment. As with dreams, where the imagination could play, wishes, desires, and hopes could make themselves manifest where we would otherwise prefer they remain hidden. Thus, the layers of hiddenness and revelation that unfold in this four-decade romp we call Star Wars are so many that only a few could be addressed in this forum.

Let me focus on the darkness of the Dark Father/Darth Vader, whose hiddenness awaited confirmation as he was audio-visually “black” (through the voice of James Earl Jones), which made his unmasking (or is it unhelmeting?) such a disappointment of continued white revelation. Power, after all, is feared most as black and thus relieved of threat when the rather pallid and, unfortunately, redeemed father was re-whitened. Such a disappointing fate for the greatest line, perhaps because of its psychoanalytical and theological significance, echoed across the reaches of cinema, mythic and historical: “I am your father….”

Freud, as we know, would have a proverbial field day, at least given his arguments in Moses and Monotheism.  Redemption could only be offered through a specific son’s willingness to take on the sins of all the sons, symbolic and otherwise, for the death of the Dark Father, which makes the re-awakening a wonderful subversion of the original premise by making an ornamentally white warrior reveal a dark interior. The force, in this black stormtrooper (Finn), speaks teasingly (great decoy of early footage of him holding a lightsaber) to larger forces at work (yes, pun intended) as the sins of the father, including Founding Fathers, come to bear on what they had suppressed across time in the whitewashing of history, mythic or otherwise. Perhaps in this awakening, Black Lives may matter a little more through at least one black life willing to walk under the light without white uniform and its uniformity.

Yet, what is going on here is more than a racial matter. The penchant for trinities, which the setup of this one promises for the overall portrait of nine episodes, raises the question of Father, Son, and Holy Ghost.  Patriarchy, we should remember, is ultimately a conflict between fathers—in fact, The Father—and sons, in which Christianity offered some resolution in terms of a special son killing the father and taking on the sins for the deed. Women, in this portrait, suffer as collateral damage. Hence the twists and turns offered here, in which Han Solo (need we stress the contradiction of a man with the name “Solo” trying to raise a family?) faces erasure through the call of the Dark Father as the only father to whom homage must be paid. There is no way out, as Luke (whose name, after all, means “light”) is also a son, which continues the dramatic Father-Son patriarchal relationship.

Resolution? While his sister Leia produced a son (Ben/Kylo Ren—his given name being redundant, as “Ben” is Hebrew for “son”), Luke offers hope in terms of a daughter (Rey, whose name requires a slight change in spelling to become “ray,” as in “ray of light”), through whom a new relationship is raised: the Father-Daughter possibility. She disrupts this pathological Father-Son drama, and more: her heart reaches not for whiteness clothed in black but for blackness once clothed in white and now, at least for a time, wearing the leather jacket signaling a bad Mutha…yes, watch your mouth.

All this may be too much for Geeks’ jouissance. Some, as we know, protested the unholy of unholies of antiblack and sexist societies: hetero-normative love between a black man and a white woman. Though tapping into the childish world of entertainment and play, and though retelling its humble beginnings on a desert planet, with all the biblical resonance with which we are familiar, this installment of the series offers more grown-up reflection in a world that is willing to accept other species but shudders at the blackness within.

The star of this installment of Star Wars is Daisy Ridley (Rey) as far as I’m concerned. I look forward to seeing the extent to which her character may be able to disrupt The Force—including the Son-Father who is now training her—in this primordial battle between power that destroys and power through which life flourishes.

Types of Academics and Other Kinds of Intellectuals

with Enrique and students in Mexico City 2015

I recently had a wonderful breakfast conversation with Enrique Dussel and three of his former students after keynoting the Philosophy of the City conference in Mexico City. Our conversation took a turn to my thoughts on the types of academics I’ve observed over the years. The discussion occasioned much interest, and Enrique asked me to send him a letter outlining my observations so he could place it on his website. This is what I wrote:

5 December 2015

Dear Enrique, Luis, Bernardo, and Jorge,

It was so good having breakfast with you today. As promised, here is a summary of my typology of professional intellectual groups with a focus on academics, particularly philosophers and scientists. I apologize for writing in English. Doing so in Spanish would take me a long time, and you asked me to send it as soon as possible.

My experience in the academy, the arts, and other intellectual communities has led me to conclude that there are basically three types, with mixtures and alliances across the lines with serious epistemological, historical, professional, and political consequences. They are basically (1) those who are smart/skilled, (2) those with one or two original ideas, and (3) those who are overflowing with creativity/originality.

Let’s use the academic as the focus for the sake of argument. No one becomes an academic without being considered “smart.” Thus, at first, all anyone who enters this world knows is that she or he is basically smart and can ascend through being considered “smarter” than those who weren’t able to do so. At each stage, the pool consists of those who distinguished themselves in secondary school, then at university, eventually through the master’s and then the doctorate. Along the way, many others fall by the wayside.

Then there are those who distinguish themselves by securing academic employment. Eventually, there are those who become tenured and achieve the coveted rank of full professor.

At this point, all we have are “smart” or “skilled” people—in short, “academics.” They have met the criteria for each stage.

Now, if we go back to the point of initial employment, the only thing every employed academic in a society that understands reward in terms of employment knows is that she or he at least belongs to (1). Let us simply call this category the ones, and the others the twos and the threes. Correlatively, we could also say the firsts, the seconds, and the thirds.

The only way to know where anyone ultimately stands is through the work she or he produces. The world, as we know, has many people who have been called “smart,” “brilliant,” and even “geniuses” who have produced no work to validate such claims. In the end, it’s simply a judgment made by many who are impressed by things such as standardized examinations, witty conversations, or projected intellectual investments.

The ones or the firsts simply produce, over their entire body of work (writings, research, performance, whatever “production” model we seek), “smart” or highly “skilled” exemplars. The best among them are simply known as the smart getting smarter to achieve the status of the smartest or most skilled. Notice, however, that the ones don’t always have to produce, as their using their intelligence to secure their employment is also treated as evidence of their smartness. Thus, simply acquiring a prestigious appointment deems its holder a member of the ones even if she or he produces nothing afterward.

The twos are those who have one or two original ideas. All among them are basically smart. Some are smarter. Some are even among the smartest with the addition of the one or two original ideas they develop.

The threes should at this point be obvious. They are burgeoning with originality. They are creative and a constant source of new ideas. They, too, began as basically smart. Some are smarter. Sometimes, they are even among the so-called smartest. But more often than not, the energy devoted to creativity leaves less time to focus on technique and other operational concerns (such as acquiring institutional power) that tend to be features of people who are smart without imagination.

In a psychologically and sociologically healthy environment, each sees her or himself as part of a community of people producing knowledge (or other expressions of the intellect and imagination, such as art) for the good of all. The goal of such a community is the flourishing of ideas for the welfare of humanity and ideally all life and even nonliving things—in short, all reality. There is in such environments an attunement with reality and ideas as ultimately greater in purpose and value than the self. In that world, the ones’ contribution is using their smartness, whether through their own research or the institutions they manage, to facilitate and improve the original work of the twos and the threes. And the twos and the threes draw on the techniques and precision of the ones for a beautiful marriage of imagination, evidential connections with reality, and the communicability of each in the form of content and technique.

The healthy environment is not, however, the norm. Other factors such as market rewards and vanity come into play. With aspirations of glory in the form of academic prestige or fame, the ones use their smartness to convince the rest of the world that their (the ones’) work exemplifies the best that humanity can achieve, which by extension suggests that they are the best exemplifications of what academics could hope to become. Unfortunately, this involves suppressing the appearance of the twos and the threes. Worse, sometimes the ones would conspire with members of the twos for the elimination of the threes. So, together, the ones and the twos offer themselves as the best for which the academy could hope.

The twos in this environment vary, according to whether they exemplify a mixture of ones and twos and also those other factors of vanity and professional prestige. If the latter, then the threes are in big trouble.

The threes in truth are so captivated by ideas and their work that they have little time for the manipulation of power to which the ones and some of the twos may dedicate some (at times all) of their energy. The threes are thus the most vulnerable in this unhealthy academic environment. It is rare to find one among them who is cognizant of the nefarious forces encircling them and with the smarts to know what to do before it is too late.

At this point, Enrique, you pretty much recognize many of the players in your more than sixty years’ participation in the academy, as we discussed this morning. I was particularly struck—Luis, Bernardo, and Jorge—that you can already recognize this dynamic so early in your careers. These dynamics are pretty clear in many academic philosophical departments and associations, as you already know.

It is without question that the ones dominate professional philosophy. In fact, analytical philosophy, which is hegemonic in Australia, Canada, South Africa (and many former African colonies), the United Kingdom, the United States, and growing across Latin America (no doubt through foundational and continued counterintelligence funding), could be called the philosophy of the ones. The coveted ascription in that branch of philosophy is to be called “smart.” And even better: “really smart.”

There is an equivalent in Euro-continental philosophy similarly across the countries I just mentioned with the addition of those in continental Europe. Instead of formal logical skill, they offer textual techniques of “readings” and historical knowledge (though there are many with preparation in formal logic).

The enemy of many members (I stress not all) of both the analytical and Euro-continental camps, however, is the threes, and the ones often invite the twos to work with them as the best model: an academy spending much energy on achieving at best one or two original ideas. The threes are in fact beyond all this stuff, which makes them often “guilty” of the offense of being “undisciplined.” If one looks at the history of philosophy, the players and the trends should be obvious. The rub, of course, is that over time, despite the efforts of the ones, posterity belongs to the threes, except in the healthy environment, where all three groups are recognized for a collective achievement such as an intellectual movement or, as in technological science, a single event such as a human being setting foot on the moon.

Amusingly, some sociologists and historians of philosophy misrepresent the field by making themselves—often exemplars of the ones and the twos—models of what to seek in the past. And their Eurocentrism often leads to the misrepresentation of the past in that regard. Think of how much we learn about what are in fact minor Hellenic philosophers versus a creative giant in Northeast Africa such as Imhotep, who preceded them by 2,000 years!

There was a time when analytical philosophy could produce threes (as attested by Russell, Whitehead, Wittgenstein, Austin, Carnap, Tarski, Ruth Barcan Marcus, and so forth), but that is long gone. What have been touted as “great” achievements since the 1970s are deemed so pretty much through the dominant narratives of the ones, colluding with some of the twos, who offer themselves as the best that could possibly be achieved. They do so through the colonization of journals, publishing houses, universities, and foundations and use that power to block the emergence of as many threes as possible. I can’t think of a single analytical philosopher since the 1960s who could seriously be considered in her or his achievements to be more than a member of the twos.

Euro-continental philosophers have done a similar thing, but they do so less in terms of technique than in terms of identity and location—namely, legitimation through Eurocentrism, often in the form of “interpretation” or “hermeneutics” as appeal to “tradition.” Much of this is well known in the conservatism and fascism of Heidegger, Gadamer, and Croce, elements of which were paradoxically in Arendt in terms of her Deutschophilia and Anglo-Americophilia. This is balanced by alternatives, as there are forms of vitalism, phenomenology, existentialism, structuralism, and even poststructuralism that are not exclusively “continental” in my view, as the work of Bergson, Husserl, Jaspers, Cassirer, Sartre, Beauvoir, Merleau-Ponty, Weil, Ortega y Gasset, Foucault, and Derrida, among the most outstanding, attests, and they, at least, addressed the world. Unfortunately, at times, the Euro-continentalists collude with the analytical philosophers, as we see in some members of recent Frankfurt School critical theory, in a common cause of excluding creativity, especially from philosophers who belong to the periphery of the Global North. For my part, I find only four members of the Frankfurt School, Marcuse and Fromm from the earlier period and Jürgen Habermas and Karl-Otto Apel from the later, really worth my time.

This unfortunate portrait also emerges among some who espouse the pragmatist tradition. Beyond the classic period of Peirce, James, Dewey, and C.I. Lewis, with occasional inclusions of Du Bois and Alain Locke, Richard Rorty is touted as the Great White Hope, when in fact his achievements amount to a strong member of the ones at worst and a solid member of the twos at best. In the end, they support legitimating practices that treat intellectual work as legitimate only insofar as it is manifested by the Euro-north.

I wrote much about these issues in a different way in my book Decadencia disciplinaria: Pensamiento vivo en tiempos difíciles, published in Quito, Ecuador, in 2013 and in a Chiapas, México, edition of 2014. The original English edition, Disciplinary Decadence: Living Thought in Trying Times, was published in the States in 2006. I described the circumstances that occasion these three types as disciplinary decadence because the practitioners turn away from reality through investment in the professional and epistemologically limited rewards of methodological fetishism. Living disciplines reach for the world and grow. When they turn inward and become obsessed with themselves as if they are reality, they are in fact dying while under the illusion of being alive, which is why I call them the living dead. It really is a zombification of thought. I argued for transcending all that through what I call a teleological suspension of disciplinarity. This involves being willing to go beyond one’s discipline and its received method for the sake of reality. In philosophy, I describe this as the paradox of philosophy beyond philosophy.

Regarding the many disciplines, I encouraged disciplinary communication—a task more difficult and radical than interdisciplinarity—in the form of transdisciplinarity, which is a level of communication through which new disciplines more attuned to reality could emerge and in their turn may face eventual transcendence. As you already know, many of us have taken up such a task in our pursuit of knowledge across the global south through freeing ourselves of the dialectics of recognition. Producing the work supervenes over quests for recognition.

Strikingly, the greatest of the greats, so to speak, are threes who throughout the ages always addressed humanity. In terms of those from the Global South who became ancestors in the twentieth century, Sri Aurobindo, Joseph Auguste Anténor Firmin, W.E.B. Du Bois, Anna Julia Cooper, C.L.R. James, Frantz Fanon, Steve Bantu Biko, Ali Shariati, Keiji Nishitani, and Abdul-Rahman Badawi are some that come to mind.

Although I’m mentioning philosophy here, one could apply this, as you correctly pointed out in our discussion, to nearly any given discipline or field. As this schema hasn’t existed as an object of study, I cannot offer empirical data but instead an educated guess, given my going on thirty years in the profession and my historical research on the emergence of intellectual movements, their associations, and networks. I expect the ones to be about ninety-three percent of academics, the twos to be about six percent, and the threes to be about one percent and, in some fields, even less—that is, a fraction of one percent. One could easily see why the ones would be very attractive to a market-oriented system of knowledge and its rewards. Their raison d’être is mastering a system of recognition and its rewards.

One thing I would like to stress, again and again, is that no one initially knows where she or he stands in these categories. It is the work that matters, and the committed among us care most about that and its telos or purpose. The problem is that for too many it’s about their egos. They would like advance knowledge of where they stand, and even that is not enough. If they could forecast their location, many would prefer to misrepresent the outcomes so they could appear as twos or threes who are also ones. Yes, they want it all.

The historical example of Robert Hooke, Edmund Halley, and Isaac Newton, which I recounted in our breakfast conversation, illustrates my point.

Recall that Hooke was the head of the Royal Society of London for Improving Natural Knowledge. Through the study of cork, he discovered the cell, which was a monumental achievement in biology. His inventions and studies of micro-reality, as documented in the meticulous drawings of his Micrographia, were indeed groundbreaking. Halley was an astronomer and geographer whose observations, mapping of the sky and oceans of the earth, and techniques of measurement led to his membership in the august society. One day, Halley, Hooke, and some colleagues were at a pub engaging in scientific reflection when a discussion of the planets’ orbits emerged. Recall that Copernicus ushered in the heliocentric view, with the planets revolving around the sun. Why such precision in their orbits? Why not an equidistant circle instead of an elliptical orbit? The conversation took a turn in the form of a wager to see who could find the solution to the problem, which Halley (other accounts say Hooke) formulated in terms of Kepler’s third law of planetary motion. The game was proverbially “on.”
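For readers curious about the mathematics behind the wager, the classic reasoning can be sketched in the simplified case of a circular orbit of radius r (the historical problem concerned ellipses, which Newton’s full treatment handles). Kepler’s third law alone already points toward an inverse-square force:

```latex
\begin{align*}
T^2 &= k\,r^3 \quad \text{(Kepler's third law; $k$ the same for all planets)}\\[4pt]
a &= \frac{v^2}{r} = \frac{(2\pi r/T)^2}{r} = \frac{4\pi^2 r}{T^2}
   = \frac{4\pi^2 r}{k\,r^3} = \frac{4\pi^2}{k}\cdot\frac{1}{r^2}
   \;\Longrightarrow\; F = ma \propto \frac{1}{r^2}
\end{align*}
```

That is, the acceleration needed to hold a planet in its orbit falls off as the square of its distance from the sun, which is the inverse-square law that Newton, via the calculus, extended to elliptical orbits.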

After months of effort, none of them was close to a solution. Halley was not so much concerned with winning the bet, however. He really wanted to know the solution. So he sought counsel among other scientists, who eventually recommended his consulting a cantankerous young man over at Cambridge who was busying himself with, among other interests such as mathematics and optics, alchemy and theology. That young man, Isaac Newton, was of modest means (a farmer’s son). He had, in some accounts, hoped to achieve wealth by turning base metals into gold with the assistance of some good grace from his beloved Christ.

Things didn’t start well when Halley met with Newton. When he explained to the latter that he was working on a challenge between Hooke and other colleagues, Newton immediately told him to get out. He hated Hooke, who had some years earlier placed Newton in ill repute over his theory of optics, white light, and its status as a wave or a particle. As Hooke was the head of the Royal Society and a specialist in many areas, including optics, his criticism had carried much weight. Hooke, Newton insisted, was a charlatan and a crook, and he would have nothing to do with an enterprise of which Hooke was a part. So Halley headed back to London in his horse-drawn coach.

Newton, however, had a sudden change of heart. He realized that if Hooke were to find someone who produced the solution, he would simply claim it and continue enjoying his undeserved status as the best scientific mind in Britain. So Newton put his energy into developing a solution and quickly fetched a messenger to deliver it to Halley in London. When Halley arrived home, the letter was waiting for him. Reading it, Halley was stunned and immediately went back to speak with Newton at Cambridge. Newton had invented the infinitesimal calculus and used its formulations to explain the gravitational forces through which the planets were kept in their orbits. (In Germany, Gottfried Leibniz had independently invented the infinitesimal calculus, but Newton’s formulation is the better known, and Newton had not known of Leibniz’s development at the time.)

This idea was so radically new that Halley insisted it be published in the form of a treatise. So he went to the Royal Society to seek funds for its publication. Unfortunately, the society had no money. It had spent its funds on books categorizing fish. Yes, fish. See, for example, Francis Willughby’s De Historia Piscium. Not to be deterred, Halley went into debt himself and worked with Newton on fine-tuning the communicability of his ideas for publication.

At the point at which the book was to be printed, a problem arose. Halley explained it to Newton: Hooke claimed the theory was his, and he wanted that acknowledged in the preface. Newton was outraged and insisted he would prefer the book be burned than published with an acknowledgment of Hooke. Halley offered another solution. He gathered the members of the Royal Society together in a meeting to discuss Hooke’s demand for acknowledgment. At the meeting, he asked Hooke to explain the theory to the rest of the society; since it was such an important development, he wanted to make sure they understood what they would be celebrating. Hooke, however, insisted that they should simply take him at his word that it was his theory, based on the formulations from the initial wager. Halley, however, pointed out that they couldn’t do so unless they knew the theory, which required its explanation. Hooke couldn’t offer the explanation, which led Halley to demand his admission that he had not developed the theory. The treatise, Newton’s Philosophiae Naturalis Principia Mathematica, was then published without any acknowledgment of Hooke. And, as we all know, a scientific classic—indeed, a revolutionary work, in the opinion of many historians of science the greatest—was born.

There are different versions of this story. In some, Hooke’s charge of plagiarism against Newton came after the first edition was in print, and the events leading to his being expunged from the text concern the second edition onward. Other accounts place Halley in a minor role. What is clear is that Halley played mediator in all this.

It also turned out that Halley was up to much more with the famous wager. He had a hypothesis about comets of doom in the historical record. Using Newton’s equations, he was able to defend his hypothesis that several recorded comets were in fact the same one by predicting its next appearance. His successful prediction led to its being named after him: Halley’s Comet.

We see the three groups in this tale, specifics here and there notwithstanding. The ones were the producers of those books about fish. Their correlates today are the gatekeepers of the many academic journals in which the goal is simply to demonstrate professional expertise or skill or to take advantage, by way of financial support, of the given system. Hooke, monumental though his contribution to cellular biology was, belongs to the twos in the overall scheme of things, though in microbiology he was clearly among the threes. Newton is obviously a member of the threes. And Halley was, within his fields of astronomy, mapping, and geography, clearly among the threes and overall a member of the twos or somewhere between the twos and the threes, though I’m sure there will be debate, as comparing biology and astronomy is much like comparing apples and oranges. These categories are, after all, fluid.

Notice also how Hooke’s psychological motivations mirror so well what dominates the academic world today. Halley, however, appears to be the hero of this story, or at least of the possibly apocryphal account I’ve offered. After all, he worked through the entire theory with Newton but selflessly left all the glory to Newton, who, by the way, claimed to have developed the theory years earlier. Halley could have easily colluded with Hooke and produced a purportedly co-authored version at Newton’s expense. He was, however, more concerned with finding the solution to the problem, and in doing so played an extraordinary role in one of the great achievements of our species. After all, had the wager not been made, had Halley not visited Newton, the young mathematician and optical scientist would have continued devoting his intellectual energy to alchemy and theology (to which, oddly enough, he returned later in life). We should also bear in mind that Newton’s initial efforts on an impossible task might have prepared him well for an achievable one. We should also add that, as we know from relativity theory and quantum mechanics, Newton wasn’t the last word, but he was a giant condition for the possibility of work that continues to this day.

Halley had also faced many obstacles in the course of his career. His former mentor John Flamsteed had become jealous of his achievements and did everything in his power to prevent his securing a post at Oxford. Halley’s employment included work for the government at the Mint through the aid of Newton during the latter’s period in parliament. He obtained the Oxford post many years later, to Flamsteed’s chagrin, and eventually succeeded Flamsteed in the most coveted post in astronomy.

Enrique, I think that you and our students could easily see where many philosophers, past and present, are located in this schema. A sad feature of our present, however, is that the market colonization of knowledge and intellectuals, which I wrote about in Truthout some years ago, has fortified the grip of the ones. We are in an age hostile to creativity and reality. As that stranglehold tightens, certain areas of philosophy will become even more hostile environments for the production of great ideas.

And, as you rightly pointed out, this applies to the arts and many other areas of social life. We already know how the commercialization of music has enabled unimaginative recordings and performers to dominate, with occasional appearances of artists with one or two moments of genuine creativity. And truly creative artists are often the proverbial starving or little-known ones. This is also connected to a serious error made by the Frankfurt School (a collusion of ones and twos) in their very racist attacks on jazz and other forms of Afro-modern music. They literally set up the musical equivalent of the ones in the form of Euro-classical music as the model. What they failed to see is that each art form has within it these categories. Thus, Bach, Mozart, and Beethoven were threes in a sea of some twos and mostly ones. In jazz, one could think of Armstrong, Waller, Ellington, Strayhorn, Parker, Monk, Mingus, and Coltrane (all threes) in a world of musicians who were twos (very excellent and creative musicians) and ones (those of fine technique who simply imitate or play standards perfectly). We could think similarly of rock ‘n’ roll, rhythm and blues, reggae, soul, salsa, samba, and even now hip hop, where similar forces come into play.

I think there is much we could learn from the Halley model, in the form characterized here, a story repeated in other areas such as philosophy and the arts. There are great artists who understood they needed to create conditions for even greater art to emerge. I see Miles Davis (though his personality left much to be desired) and Art Blakey as being of this kind, as they mentored and created opportunities for so many great musicians. The same could be said among researchers and scholars. Devotion to the greater project—producing what ultimately belongs to humanity—requires creating opportunities.

I think that’s a problem (among so many) in the current situation of hegemonic knowledge in the Global North. The ones, who are reducing the conditions by which threes could emerge, dominate and are growing beyond the estimated ninety-three percent. The problem with threes, after all, is that they are pretty much like the Jewish story of the Messiah: she or he could be the homeless beggar or prostitute or gangsta youth—in short, not immediately seen as what she or he is. If we think about the creativity it takes to survive in the world of illicit economies—conditions marked by so many dangers but there for populations who must engage in such because of catastrophic unemployment and violent policing of borders—what, we may ask, might such populations produce with increased access to the material conditions for the production of knowledge?

Past communities of knowledge offered some openings for those who are forced to devote so much energy to alchemy and other challenges of the imagination. Today, however, only cracks remain, and they are getting sealed.

It’s crucial, then, for us to put on the table the building of alternative institutions for the production of knowledge and learning. Even if the ones, the twos, and the threes are, as we have seen, the inevitable groups to emerge, a world in which they work together for a cause greater than themselves is one in which so many, if not all, of us will benefit.

Thanks again, Luis, for organizing our breakfast discussion. And, yes, you have my permission to share this letter and post it on your website, as, given my argument, such thoughts are best suited for public reflection and debate.

Irie tov y en solidaridad, Lewis

Elijah Gordon’s Bar Mitzvah Speech

A year and a half ago I posted reflections on my daughter Sula Gordon’s speech on her becoming a Bat Mitzvah. I here offer my son Elijah Gordon’s speech. I continue to be moved by the image of a child carrying so immense a responsibility as Torah. It is marvelous, as it is not about the physics of the matter. It’s about placing such expectations on children, which, as we all know, many will be able to bear, while others could be crushed by even the thought. The Rabbi who officiated Sula’s ceremony was recently injured and could not fly to the event. The circumstance created an opportunity to listen, so to speak, and so we did. We decided on a Mitzvah not often discussed. Elijah’s became the first Bar Mitzvah ceremony officiated by Sandra Lawson, the only out African American Rabbi from the LGBT community. As it turned out, the weekend was the culmination of Gay Pride week in the Hartford and West Hartford area. It’s an example of what we call a Baruch HaShem moment. Sometimes things happen for the right reasons. Mazel tov, Eliyahu and Sandra! Here is what our new Bar Mitzvah had to say.



First, I want to thank my family; Sandy Freedman, for preparing me for this special day; Sandra Lawson, for being the officiating Rabbi for this ceremony; Capers Funnye, for having been my Rabbi for over a decade; and all of you, my relatives and friends, who have joined me today. Rabbi Funnye had an accident that prevents him from traveling, so he is not here today. We send him Torah blessings for his recovery. Sandy Freedman worked really hard with me. She is a dedicated, inspiring teacher.

I also want to give thanks to those who are no longer alive but who I wish were here on this eventful day. Thanks to my late grandmother Yvonne Patricia Solomon-Garel, my late step-grandfather Jack Garel, our wonderful family friend Gary Tobin, who co-founded Be’chol Lashon, my paternal grand-aunt Thelma Chong-Young, my uncle Lewis Samuel Gordon (known as “Tafari”), and my maternal grand-cousin David Levy.

My Torah portion, Shelach Lecha, although specifically about spiritual spies, could also be read as a story about how some people see only the negative sides of a situation and others see its good potential. This is also a reflection of important debates in many Jewish communities. Some Jews only see the bad side of things, while others see good in them. My family is an unusual Jewish family because we belong to at least three lines of Judaism: Ashkenazi, Mizrahi, and Sephardic. (For those of you who are not Jewish, Ashkenazi mostly means European Jews, Mizrahi means East African and Middle Eastern, and Sephardic means North African and Mediterranean Jews.) There are other kinds of Jews, but these three are the ones in my family. Some people might say there is something strange or even wrong about my family not being only one type of Jews. This is because they have a problem with mixture.

My family is not only religiously mixed, with Jews and also non-Jews. My family is also mixed in other ways. Through my mother, my family has people who are German, Lithuanian, and South African. Through my father, we are very mixed. He calls us “Jewmaicans.” We are Jamaicans who are of East and West African, Chinese, Cuban, Irish, Palestinian/Israeli, Panamanian, Scottish, and Tamil Indian descent. Some people think diversity and mixedness mean being “impure” and “not real.” They think being a person like me means I don’t properly “belong” to any group.

I, however, think there is no such thing as anyone being truly “pure.” Second, instead of not belonging, my experience with my relatives is of always belonging. Third, I know what each of these groups is like from the inside. It is like my Torah portion because I can learn among each group, even from those who don’t always know I am a member of another group. Fourth, visiting my relatives means I travel around the world and learn what people are like in many places. And finally, the people who only think negatively forget that people all around the world are human beings. I get to see and experience our common humanity.

There are others who share my view of mixtures as strengths. My Jewish education includes two special organizations: The Friday School in Providence, Rhode Island, and Be’chol Lashon in San Francisco, California. They are inspiring examples of “strengths”: They bring together Jews from many backgrounds and they ask us to think about how to make Judaism more inclusive. With their help, I grew up knowing the true diversity of Jewish people.

Ironically, I also learned about being a Jew through some special people who are not Jewish. First, there is Mr. Patterson from St. Paul’s School. He taught many life lessons on how you should treat people you don’t know. And he also loves awful puns, which makes him like my maternal Jewish grandfather. It’s a combo plate. Second, there is Ms. Higg, who was my first-, fourth-, and fifth-grade teacher. She encouraged me to aim at growing into a good person. Third, there is Uncle Paget, who is really my godfather, and he is so close to us that we call him “Uncle.” Then there are the Tucker boys: Eric and Will. They understand love, family, and community. And then there are Matthew Kos and Joshua Bruneau. They insist on working hard at our talents to see what life offers. All these people support love, family, community, and building a better world. These, as we know, are important themes of Torah, of Judaism.

A Torah portion is an important part of Jewish learning. My Jewish learning began with my Mom and Dad.  My Mom taught me many Jewish prayers and helped me greatly with my Hebrew. My Dad has taught me the history of Judaism from a very young age and its importance for our family. Both my parents taught me that Judaism is not about some old bearded man bullying us through life. It is about taking seriously two sides of the meaning of G-d. One side is about there being a world in which we live. The other side is about taking responsibility for that world. It’s about ethics. It’s about at least trying to be the best we can be.

Thank you,

Eliyahu Shlomo

On Divergent


So I went to see Divergent. I figured my effort to keep up with the popular cultural lives of my students and children requires my going through the full gamut of these coming-of-age movies after taking the plunge and now being a survivor of the Twilight series. These films, which include The Hunger Games and its sequel, emerge, formulaically, from novels premised upon appealing to a special demographic: adolescent girls. I saw the first installment of the Twilight Saga with my eldest daughter when she was twelve; the first of The Hunger Games with my youngest daughter, when she, too, was twelve. And Divergent? Well, my wife and daughters wouldn’t go, so I went along with my youngest son, who, yes, is twelve.

On the surface, Divergent had all the elements to insult one’s intelligence: an adolescent girl who is not like anyone else (remember in Twilight that no vampire was able to read Bella’s mind) and who, caught up in a world of adults who ultimately act like children, defies a system premised upon foretold divisions or, as it is in this case, “factions.” Whether as races, species, or factions, it’s pretty much the same story in these novels and film adaptations. A hot but at first untrustworthy or “dangerous” guy comes into the picture, and sublimated sexual attraction gets the ball rolling until consummation, after much built-up desire mixed begrudgingly with forbearance, occurs—usually at the point of overcoming the powerful, extraordinarily and arrogantly smart enemy.

Divergent is not disappointing in this regard. There is the adolescent or almost adult girl (sixteen years old—what a sweet sixteen, no?) who doesn’t fit in with the system, and it is ultimately through her that disruption occurs. There is the hot guy, who appears at first dangerous. There is the forced division of society. And there is the set of race and gender clichés of the immediate friendship with the cool black girl (interestingly played by Zoë Kravitz—which is why I kept thinking of Lisa Bonet—and who creates subtextual reference across fictional universes, as her dad Lenny Kravitz was in The Hunger Games), which, in American race logic, means no major presence of black males (except, as in Twilight 2: New Moon, a black male rapist who is decapitated, which also means symbolically castrated, and in Hunger Games 2: Catching Fire, where Kravitz’s character is brutally killed by officers in white). The two Kravitzes add an additional, albeit unintended, dimension of in-betweenness; as Afro-Jews, they raise multiple challenges to these predominantly white and Christological universes. Returning to gender, however, there is a black male character in Divergent who is completely complicit with the wrongdoing at hand. That black male is patently not marked, or rendered likeable, by mixture.

There are as well several other layers that make the story line move from the sophomoric to the unexpectedly perceptive and, dare I say, even brilliant.

I haven’t read the novel, so I won’t get into the author Veronica Roth’s intent. In terms of the film, there is what looks like a sophomoric reading and critique of Plato’s Republic. In this post-apocalyptic future, Chicago is a city-state, surrounded by walls and a large electrical fence, in which social division is premised on five official “factions”: abnegation, for the selfless; amity, for the peaceful; candor, for the honest; dauntless, for the brave; and erudite, for the intelligent. There are technically two others that pose problems for this effort at balance and utopia: divergent, for those who manifest all five traits, and factionless, for those who belong to none.

The governing group is the Abnegates, which makes sense since they are “selfless.” (The purpose of government, as many seem to forget today, is service.) This already begs the question of intelligence, since, apparently, selflessness is not a mark of intelligence. This governing body eschews vanity (rarely seeing themselves because of their preferring to live without, or with very few, mirrors) and greed (living in homes befitting at best a second-world existence). The Erudites are sharply-dressed mixtures of postmodern cool and STEM (science, technology, engineering, and math) types living in sharply contrasted environments of the whitest white and the blackest black, and, as we ultimately learn, they are greedy for power to the point of fascist zeal: in other words, they’re ultimately right wing. The legal system is left to Candor (truth-telling lawyers—really?), and the cultivation of crops to Amity. The Factionless, homeless and depending on the kindness of strangers (aye, those in need of welfare), look like zombies.

The critique of Plato’s Republic is already evident in having Abnegates rule. That villainy comes from the Erudites (the academicians, ultimately) reveals the flaw of having philosopher kings and queens. (These philosophers are, of course, premised on scientists and technicians, whose intrinsic capacity to lead, I should stress, existential humanists reject.) And the warriors, who are mostly adrenalin-pumped jocks, pretty much follow whoever manipulates them. Limited of mind, they easily succumb to mind-altering and controlling drugs.

So, there is already much for me to hate politically about this film. Anti-intellectualism would support right-wing rhetoric in the US these days, and, of course, there is no critique of what is ultimately a military state. Basic training proves to be rather essential, however. These stories, after all, tap into a fundamental fantasy of all adolescents: they wish to be able to defend themselves and protect their loved ones. This is radically so for adolescent girls, who for millennia have been offered protection through the well-muscled arms of males, under whose protection also lurks the danger of a force that could easily be turned otherwise. While bad boys could fight off other boys, who is to fight them off if they remember that they are, after all, really bad?

There is also a psychoanalytical dimension, which, too, is part of the recent adolescent-girl-saves-the-world craze. The beautiful Kate Winslet plays Jeanine Matthews, the villainous, slick-looking fascist leader of the Erudites, ruthlessly plotting to eliminate the Abnegates, engaged in combat against the also beautiful young Shailene Woodley (whose magnificent performance in The Descendants so impressed me that her playing the Divergent protagonist Beatrice Prior is another reason I went to see this film; great acting is sometimes the salvation of many a cinematic stinker). With the underlying theme of facing one’s reflection in the mirror, we are reminded of no less than the fairytale “Snow White.” As the older woman attempts to birth a society in her own image, the theme of narcissism, of standing in the way of the coming-of-age young woman, is familiar stuff.

Yet, I really like this movie.

An irritating dimension of many of the other exemplars of this genre is that the heroine is often flat and her gifts are inexplicable. The parents are losers, and the viewer has to endure watching a world run by children ultimately because there are no real adults around. This criticism doesn’t apply to Divergent. The Divergents make sense once one considers the fundamental flaw of the whole silly system and premise of the film. (For logicians reading this, one should remember the power of the empty set: everything it generates is valid.) At sixteen, everyone learns through a psychological test—where situations are induced through a mixture of drugs and Matrix-style virtual reality—the faction to which she or he belongs. But in the ceremony, one must still choose the faction to which one would prefer to belong. For those who choose beyond their faction, this means facing an uphill climb to belong. If they fail, they cannot return and cannot stay, and thus become factionless. As some do succeed, it follows that this experiment in eugenics already has people from different factions mixing with each other to produce in effect “mixed” children. These children could inherit the weaknesses of their parents, but they could also embody their strengths. Over time, it logically follows that people would be born with the strengths of all factions. Divergents, then, are inevitable outcomes of this system.

The result is a tale of what could be called virtue ethics. Now, as the Catholic Scottish philosopher Alasdair MacIntyre pointed out well in his book After Virtue, the transition from the Greek notion of aretê (and I would add the Egyptian/Km.tian concept of ma’at) to what today we call virtue was not seamless. The Medieval Latin concept of virtu, which connects more to female virginity, created some confusion with notions of sexual purity. This element is there in this series of adolescent-girl-saves-the-world films, since each heroine, albeit for the most part horny, is also virtuous in the Christian sense of pure of heart (and, apparently, the rest of her body). Yet what is good about the protagonist in Divergent (and thankfully also in The Hunger Games) is that the ancient concept of excellence comes to the fore and in effect liberates virtue from virtu. In other words, as a convergence of excellent attributes, mythically marked as being selfless, peaceful, honest, courageous, and intelligent, there is also the addition of a special kind of existential attribute—namely, the courage to face the contingent, to take the risk or leap of faith (and there is actual leaping in this film) into the unknown.

Courage is an underlying theme that taps into the psychology of the vicious or those marked by vice: governed by fear, which disgusts Tris, such people ultimately do bad things. Complicating the matter, however, and what makes this portrait of ethics and politics insightful, is that there are Erudites who go bad. Not governed by fear and drunk with knowledge, what motivates their infelicity?

We stand now in the terrain of what the ancient Greeks called ἀκρασία (akrasia): knowingly and therefore willingly committing evil. What complicates the matter is that the term actually means “without power” or “without command”—in other words, not controlling oneself. This makes the knowledge one has insufficient for the course of one’s action. Something else must be at work. Now, philosophers from the time of Socrates to the present have disagreed about the psychology and anthropology at work behind this notion. Isn’t there some epistemic content, some knowing, at the heart of willing? If one cannot control oneself, how is that possible when it is oneself who must refuse control? While not really being in control (weakness of will) may offer some solace for those who would like moral education to prevail (in other words, as Fats Waller, paraphrasing Jesus of Nazareth, used to say, “They didn’t know any betta”), the real and troubling question is, What if they did? Do we really believe that the Erudite Jeanine doesn’t know that she is organizing mass murder and is not willingly taking responsibility for that action? Is she seriously suffering from a weakness of will?

It is this kind of reflection that makes both the story and its cinematic adaptation special. These are stories and films that, after all, break many of the rules of what used to be offered to the young—even when accompanied by a parent. It is not simply that they are violent films, cinematic presentations in which people are tortured and killed. They are also films in which children and adolescents commit these actions. Beyond the questions of targeting the fantasies of young women and men—sexual and otherwise, at least with regard to physical strength and agility—there is the broader social and global question of why adult situations are being placed in the context of children's literature and film.

I offer this hypothesis. We are living in times of radical political impotence. The erosion of political outlets, where communities resolve differences through the resources of speech and actions premised on commitment to a greater good, has led to disastrous consequences in our age of neoliberalism and neoconservatism. Though sounding fancy, these terms refer to the radical privatization of institutions premised on social welfare and the isolation of individual existence, paradoxically, through the valorization of individual rights. As Jean-Paul Sartre showed in his Critique of Dialectical Reason, all the valorization of individualism achieves is the assurance of individuals who depend on themselves for protection: in other words, a separate but equal collective of vulnerable people. Corporations and private groups are, after all, groups, and as such they could crush individuals. And they do. The neoconservative end seeks order, and although there is anti-governmentalism here and there, the order sought is replete with militarism and appeals to traditions manifesting very little short of fear of the future. The problem for young people, however, is that they are the future.

Thus, the result is radicalized insecurity. Why valorize the powerful, individual body? It’s because of lost faith in a social world through which one could reach beyond one’s body. Sheer physical strength, matched by individual wits (and, of course, beauty, a blessing from the gods), affords some chance of seeing another day.

We live in a world of radical changes happening in the midst of a desperate effort for permanent control among the powerful. But the truth is that reality exceeds our widest grasp, and turning away from time won’t change having to face tomorrow. As young people are bullied away from political outlets, from sources of action that would enable them to build the future in which ultimately they must live, what might these stories be telling the older generation, even where erudite and beautiful? Perhaps this: those among us who fail to understand that our responsibility is to contribute to a future in which we must ultimately step to the side are no less than a drag on history and obstacles to the future.

A meditation on virtue amidst dystopia, even with a big Hollywood budget, is an appreciable start. Bringing together the young novelist’s insights, even if subconscious, the protagonist’s chosen name reveals much in this regard. Beatrice (from the Latin beatricem, which means “bringer of joy”), interestingly enough, is reborn through an act of naming—which, as students of myth know, is an assertion of agency through interpellation—when she becomes “Tris,” a member of Dauntless, though, paradoxically, there is also deception, as she is a Divergent. It is paradoxical because her divergence makes her dauntless. She is, in other words, dauntlessly beyond dauntless. As “Tris” is a diminutive of “Beatrice,” her chosen new name is actually a hint of rebirth. This is truly mythic stuff: the goddess becomes flesh and goes through a process of trials and tribulations to be reborn. To push the mythic elements further, as a bringer of joy, she is ideal for sacrifice, for that which is to be sacrificed is that which one does not want to lose. If she were a bringer of pain and suffering, her parents losing her would be a relief. So already at the mythic level, the proverbial die is cast. A society that forces its youth never to grow up is imperiled. It sacrifices its young.

Yes, meditation on virtue facing dystopia is a good start.

Harriet Beecher Stowe


Harriet Beecher Stowe died in 1896, the year in which the Supreme Court sanctified Jim Crow and thereby undermined the path of freedom for which she and so many others fought. Her niece, Katharine Seymour Day, who fought to preserve her legacy by securing the foundations for the Harriet Beecher Stowe Center, died in 1964, the year of the Civil Rights Act that emerged out of the March on Washington, DC, in 1963.

There are so many unusual things to learn through spending an afternoon at the Harriet Beecher Stowe Center.  Among my favorite was to discover her close relationship with Sojourner Truth.  (She kept in her living room a statue of a beautiful African woman because it reminded her of Truth.)

Other treasures include the wonderful exhibit on the history of Uncle Tom’s Cabin at the main center’s office, material on her knowledge of the apothecary arts, and others on her activist work on issues ranging from women’s suffrage to environmental causes, as well as animal rights and organic farming, and the list goes on. Harriet Beecher Stowe was truly one of the great geniuses of the 19th century, an example of a creative human being who brought innovation to everything she touched.

If you’re ever in Hartford, do visit this important tribute to this extraordinary woman and her community, who gave so much to the cause of human dignity and the struggle for freedom.


On 12 Years a Slave

Congratulations to the cast, crew, producers, and director of 12 Years a Slave.

Perhaps it is because of the increase in global inequalities and the global sense of insecurity. Whatever the cause, reflections on slavery are in the air.

That makes now an opportune time for a film such as 12 Years a Slave. Some of my colleagues, students, and I viewed this important film when it was released last fall. The effect is simply gut-wrenching. As the closing credits unfolded, we remained, stunned, in our seats.

Based on Solomon Northup’s account of being kidnapped while working in Washington, DC, and sold into slavery in the pre-Civil War South for a dozen years, this film offered a portrayal marked by the obvious demand for historical accuracy.

Few historians dare speak about the details of chattel slavery in the United States and, for the most part, the Americas. Set in the 1840s, the account takes place after the British outlawed the Trans-Atlantic Slave Trade. This led to the domestic production of enslaved people and, with demand exceeding supply, a rise in the price of the enslaved. That American slavery was racialized meant, however, that freed blacks were vulnerable to being snatched into the clutches of those governing that brutal institution.

A powerful element of the film is its laying bare the reality of what it means to live in a world where one’s humanity is rejected. It didn’t matter how much Northup, portrayed by the brilliant Chiwetel Ejiofor, protested his situation when even mentioning his real name endangered his life. Even with all the brutality involved in not only enslaving him but also making him into a slave—to wipe out of him the humanity to which he clung with drowning desperation—the absurdity and cruelty of making people into property were evident. There are so many scenes of degradation and brutality that were features of everyday life in the slave states that there is no room to mention them all here. The ethical struggles were evident in such scenes as Northup’s attempting to flee in woods with which he was unfamiliar, only to encounter a group of white men lynching three black men. There was the reminder that virtually any white person had the status of a police officer. To be black out in the open was to be a crime, punishable by dismemberment or death.

And then there is the plight of enslaved women: Lupita Nyong'o’s extraordinary, Oscar-winning performance as Patsey, who faced the lust of the master, the cruelty of his wife, and the absurdity of being whipped nearly to death because of her effort to regain some dignity through procuring a bar of soap.

Yes, the question in this historical drama is obvious: how could this have been the world only a short time ago? The answer? Who says it’s no longer so? As long as people can convince themselves that they are free of sin and that others are not really people, the proverbial rest follows.

Brad Pitt, who produced this film, also appears in it as the agent who delivers the letter that sets the course of Northup’s liberation from the sadistic situation in which he was captive. Oddly enough, in producing this film, Brad Pitt is also the means by which the historic Solomon Northup’s letter is delivered to audiences today. Pitt has received a lot of criticism for appearing as one of the few good white characters in the entire film. Whether as the historic character or in our historical present, I think the greater point is missed here: this is a message that needs to be delivered.

That Northup’s efforts to seek justice against his persecutors were in vain during his lifetime, the rest of which was also dedicated to abolitionism, should remind us, especially as we look at today’s slavery, whether in brothels, high-society homes, mines, or overcrowded prisons, that the fight for human dignity continues. See 12 Years a Slave. Talk about it. And, without hesitation, join the fight for what is nothing short of right.



Along with millions, perhaps even billions, I lit a candle on December 5, 2013, in memory of Nelson Rolihlahla Mandela, Madiba or Tata, as he is also affectionately known in the Xhosa language of his homeland, Azania, now known through its colonial and post-apartheid name, South Africa.

Candlelight has many meanings in many societies. The light signifies revealing a path for a new journey. For the living, the light shines upon us as a form of continued connection, revealing something on which we should reflect. And, for the deeply religious, it is something to be released along its own trajectory. It reminds us, as does Judaism’s mourning prayer, the Kaddish, that everything is ultimately left in God’s hands.

Mandela died as he lived. His life was a paradox of peace and violence, fighting hatred with courage and love. He died well, facing illness courageously with the unusual status of a former official of an African country whose moral greatness made him a perpetual leader. Though he faced violence and suffering throughout his life, he died in what is the right metaphor for what he cultivated: peace.

Many adjectives could be offered toward an understanding of what this great man represented. Perhaps two are most fitting at the moment: courage and dignity.

Possibly, the 27 years as a political prisoner on the infamous Robben Island could have been avoided had he not insisted on an unconditional release. His greatness, the struggle he waged for liberation, and the battle cry of his humanitarian mission stood as a reminder to those who look upon Africans and flippantly attempt to think otherwise: the forces of colonialism, misanthropy, and racism have always been wrong and remain so. Mandela stood in opposition to them and dared to declare: “We are human beings.”

Many refused to listen, but the course of history turned against apartheid. We should remember that the system of segregation created by the independent South African government from 1948 to 1994 comprised a set of institutions copied from the United States. The struggle took many forms: civil protests, insurrection, and an eventual economic stratagem of divestment that crippled that racist regime’s economy. But the world also brought, across generations, as with the youth of London, England, the power of music, with the 1984 hit “Free Nelson Mandela,” written by Jerry Dammers and performed by The Special AKA. The song became an anthem of the anti-apartheid struggle and offered, in the end, what many people continue to want behind most liberation struggles: a Messiah.

The anti-apartheid struggle produced many revolutionaries, such as Steven Bantu Biko (the leading theorist of Black Consciousness) and Chris Hani (leader of the South African Communist Party). The former was assassinated in 1977, the latter in 1993. Much unfolded from 1994 onward, when Mandela became president, of which Biko and Hani would not have approved. Mandela has now joined them as an ancestor, but his place in historical memory brings an additional phrase into focus, one more palatable to the political world that emerged and perhaps a dangerous, paradoxical trap, as we can see in a Barack Obama, who perhaps could not have been were it not for Mandela’s precedence: moral leadership.

Yes, South Africa was an imitation of the United States, and then the child became the parent when the US recently echoed South Africa in Obama’s presidential elections. No issue addresses the moral failures of both countries more than their racist past and present. The irony is in trying to save these countries from embodying their greatest fear, namely, black representation. Yet such a figure could not emerge as black representation. This marked yet another paradox, as we see in today’s South Africa and in the United States: Messiahs are by definition exceptions, not rules. The prizes alone could not be the model for an everyday man or woman:

the Nobel Peace Prize, the Bharat Ratna, Time Person of the Year, the Sakharov Prize, the Presidential Medal of Freedom:

the Congressional Gold Medal, the Arthur Ashe Courage Award, Queen Elizabeth II’s Diamond Jubilee Medal, the Gandhi Peace Prize, the Philadelphia Liberty Medal, the Jawaharlal Nehru Award for International Understanding, the Lenin Peace Prize, another of Queen Elizabeth II’s Jubilee Medals, the Nishan-e-Pakistan, the Al-Gaddafi International Prize for Human Rights, the Ambassador of Conscience Award, the Simón Bolívar International Prize, the United Nations Prize in the Field of Human Rights, the Order of the Nile, the World Citizenship Award, the U Thant Peace Award, the Félix Houphouët-Boigny Peace Prize, the Isitwalandwe Medal, the Indira Gandhi Award for International Justice and Harmony, the Freedom of the City of Aberdeen, the Bruno Kreisky Award, the UNESCO Peace Prize, the Carter-Menil Human Rights Prize, the Bishop John T. Walker Distinguished Humanitarian Service Award, the Giuseppe Motta Medal, the Ludovic-Trarieux International Human Rights Prize, the J. William Fulbright Prize for International Understanding, the W. E. B. Du Bois International Medal, the Prince of Asturias Award for International Cooperation, and the Harvard Business School Statesman of the Year Award.

Obama’s list is not very different, and it includes many awards that now bear his name.

Yet, again, the exception is by definition not the rule. One can love Mandela and Obama while continuing to hate black people. While the symbolic life of the highest offices has changed, the mundane life of most people of all races remains the same.

One of the travesties of the aggression against humanity that marked the modern world is that moral men could oversee the cruelest of regimes. Yet we cannot fail to insist on the ridiculousness of it. And what if these great men had, instead, tried to be immoral? What could we say of a world in which being ethical, which is even greater than being moral, is the surest way to look like a fool? Moral people are not always ethical. The former, following the rules, always try to do what is right. But ethical people sometimes appear immoral. They are, in general, courageous people who suffer much in a world that can wound them, striking at them for the obvious imperfection, marked by courage, of breaking the rules.

The world wants Messiahs. Yet God keeps sending us human beings. We are fortunate, however, that some of them turn out to be a little more than they had really imagined.

I have written much about Frantz Fanon, the famed revolutionary psychiatrist and philosopher of liberation, who died on December 6, 1961. Mandela was 7 years older than Fanon and outlived him by a day short of 52 years. Fanon faced violence but died of pneumonia due to complications from leukemia. Though seemingly random, it is strange that these two great men died from what amounts to infections of their lungs. Our lungs, after all, enable us to breathe, and mythic consciousness reminds us that they animate the breath of life. The actions of these great men were like the breath of life to the nations for which they fought. With their deaths, their children and the nation face the frightening reminder: no one lives forever.

Mandela’s wisdom was to serve one term as president of South Africa. The political and philosophical reason was classically Fanonian: aware of the Moses problem, where those who lead the way to the Promised Land are also those most capable of endangering it, Mandela decided to set an example of an alternative path to what happened in many other postcolonial states, where, after getting rid of the colonizers, the liberators became the greatest obstacles to genuine freedom.

Yet I think this great man also had an additional consideration in mind. Mandela understood himself as an idea. The grandness of his thought represented much more than his own physical image. While an inspiring model, it was also dangerous, because political life demands possibility. If the obstacle is too difficult, is there nothing we can do to overcome it? What higher standard could there be than becoming a god?

Mandela’s decision to serve one term was also, like much of his life, a paradox. By stepping aside, leaving room for others, he ironically set an even higher standard: humility, whose love is patience, and democratic faith. In so doing, he established a standard of human possibility.

So, as I contemplate the glow of the flame as it eventually goes out, I say, in an appreciation shared by many:

Thank you, Nelson Rolihlahla Mandela. Your actions inspire many of us to imagine ourselves tall and, at the same time, remind us that you were, above all, a human being, with many embodied limitations, which make hope, love, and possibility so precious.

Farewell, Madiba. Farewell.


Translated into Portuguese by Rosemere Ferreira da Silva

© Lewis R. Gordon