The joy of the godless (parte the firste)

I was recently accused by a commenter of being the wrong kind of atheist:

There is a difference between the honest atheism of the nihilist, who believes there really is no God and acknowledges the implications of such, and the self-delusionary humanism of the New Atheist, who does not really mean what he says when he says ‘there is no God’ but instead believes ‘there is a God and I am he.’  And by that I mean that he thinks he is the highest form of life there is — the noblest and most dignified Being there is (which gives him the ability — no, it’s more than that — the right, to determine that ‘all humans have equal value.’)

Apparently I am deluding myself because I’m just not sad enough. In order to be an ‘honest atheist’, I have to be a nihilist, recognizing nothing but abject sorrow and emptiness within the meaningless void of a random, uncaring universe. Otherwise I am exalting myself to heights of self-aggrandizing hauteur, imagining myself to be the single highest life form in existence.

Calling this a straw man or a caricature would understate the audacity of this ridiculous lie almost to the point of mislabeling it entirely. Nothing in that paragraph, however well it may be written, describes anything that comes anywhere close to my personal beliefs. It is the intellectual equivalent of drawing a moustache and goofy glasses on the portrait of a political opponent (I already wear glasses and have facial hair, so perhaps a better comparison is needed).

However, mulling this over did yield some fertile personal exploration of how I arrived at my atheism, and why I am not an abject nihilist. I am, save for occasional bouts of depression when reading news articles or following politics, an incredibly happy person. Ludicrously happy, in fact. At this particular moment in my life I am employed at a job I love and find challenging, and am living in the city of my choosing, surrounded by interesting, supportive, and (let’s face it) attractive friends. I have personal, musical, and political projects that occupy my free hours, and there are many more things out there for me to learn and explore.

It was not always so for me, this type of fulfilled contentment. There was once a time when I was in the throes of deep existential conflict – when I struggled day and night with questions that underlay the whole of my self-identity. I read voraciously, trying to find how other thinkers had addressed these problems in the past. These sojourns into the philosophical literature occasionally yielded a few weeks or months of respite, but inevitably I would find myself foundering once again on a sea of doubt and confusion.

I was raised Roman Catholic, and in my late childhood I began taking my religion very seriously. Because I came from a far more liberal family than average, my religious beliefs were not scripture-based, but rather ran along the lines of a code of decency, generosity, humility, and above all, forgiveness. When good things happened, I would immediately thank and praise God. When bad things happened, I comforted myself in the understanding that there was ultimate justice awaiting all people. I was happy to reconcile my scientific understanding of the universe with the bits of the Bible I had read, glossing over the parts that didn’t make sense. I was actually voted valedictorian of my confirmation class (like a Bar Mitzvah for Catholics), and asked to give a speech on our religious journey. I planned to become a priest and share my insights into the loving God with congregations of faithful believers.

But, as it says in First Corinthians:

When I was a child, I used to speak like a child, think like a child, reason like a child; when I became a man, I did away with childish things.

I began to see that the religion I belonged to in no way reflected my own beliefs. Our youth group received newsletters from anti-choice organizations filled with lies and distortions of facts. When I wrote to them demanding that they show some accountability, my letters were dismissed and ignored. I began to struggle with the hypocrisy and vulgar pomposity of the Church; idolatry on full display, hate passed off as divinely justified, a seeming abdication of the custodianship of humanity that was preached from the pulpit. It seemed as though the idea of a loving, forgiving and just God was given the lie by the hate, insolence and moral emptiness of those who claimed His favour.

So I began to read: Kierkegaard, Nietzsche, Daniel Quinn, Ayn Rand, Dostoyevsky, Hugo, Dickens, Terry Goodkind… anything I could get my hands on. At one particularly desperate point I attempted to read through the Bible, hoping to wrest some insight from its pages – sadly, the Bible is just the oral history of a Bronze Age tribe set in florid language; not particularly helpful. Thinking that my constant crisis of faith was due to laziness on my part, I redoubled my commitment to the church – reading from the lectern at mass, teaching Sunday school, playing viola with the choir. My father was of little help during this period, giving me pat answers to complex questions and becoming upset that I would even ask (even though it was he who taught me to question authority, a lesson I’m sure he now regrets imparting). I would pray every day that God would grant me some kind of solution to my constant queries, or that He would at least help me by silencing the voice in my head that kept pointing out the gaping flaws in my patchwork theology.

No help from above was forthcoming. I entered a long and bitter period in which I clung to the ribbons of my faith like a vagrant clings to the rags on his back, snarling angrily at anyone who would question me from either side. Believers were simple-minded fools who hadn’t asked the important questions, whereas atheists were simply denying the manifest truth of the majesty of the universe and the wonders of faith.

I can say without hyperbole that those intervening years were some of the most miserable of my life. To be sure, being an eccentric, chubby, racially outlying teenager contributed more than its fair share to my unhappiness. However, even in my private moments of reflection, I could not escape the constant nagging doubt – a doubt that was a gaping hole in my entire outlook on life.

So when people rhapsodize to me about the joys of religious life, and the great comfort they find in their loving relationship with YahwAlladdha, it’s hard for me not to hearken back to those years when I reached with all my mind, body and soul for some measure of that comfort and fell repeatedly on my face. The only time when I was free from the torment was during the brief windows of time in which I was able to slap a band-aid explanation or trite bit of theology over a serious question and ignore it for a while.

Of course now I am much happier, and am no longer plagued with such angst, but I am well over my post-length limit, so I will have to save that part for next Monday.

TL/DR: I have not always been an atheist, but my religious faith (when I had it) was a constant source of trouble and pain for me. Far from making me a nihilist, my atheism has made me far happier than I ever was as a believer.

It don’t matter if you’re not black or white

Because we live in Canada, and because so much of the way we see ourselves is inextricably tied up with the United States, we tend to see racism issues as black and white. I don’t mean this in the sense of a philosophical dichotomy; I mean that we tend to focus on race and racism as a black people issue and a white people issue. At the presentation I gave on October 1st, I realized after the fact that the majority of my examples of race and racism are about black people in opposition to white people. Many of my examples from the blog are about black issues in the context of the white majority.

The reality, however, is that race issues go way beyond black and white. We live in a country (and, particularly, I live in a city) that is made up of a number of different groups with distinct cultural histories that are interacting in a unique way. Each group has its own issues to resolve with every other, and the majority of these have nothing to do with white people. Kids whose parents are from India or Pakistan (or those who are born there and immigrate) have to resolve old-world issues completely outside the context of shared geography. Korean kids and Chinese kids are superficially grouped into “Asian” here in North America, but there is significant conflict between the countries of China and South Korea, conflict which is compounded by the fact that most others don’t know enough to differentiate between these two groups. Native Canadians find themselves in much the same condition (at least as far as perception goes) as black people, and yet there is very little camaraderie between the two groups.

The fact is that the racial conversation is very real for groups that don’t fall into a black/white or the _______/white dichotomy. James Sweet, a blogger from Rochester, NY, recently posted a piece asking why we use the phrase “person of colour”. After all, everyone is a colour – white people aren’t actually ‘white’. Why do we cling to this ridiculous nomenclature that seems to divide the world into white and non-white?

I responded in the comments to suggest that the reason for the term is that there are issues that are relevant to non-white people as a classification, but that referring to them as “non-white” reinforces the subtle idea that white people are the “default”, whereas everyone else is a deviation from that standard. To forestall the predictable objection that ‘nobody really thinks like that’ – yes, they do. A lot. I recently had a meeting with someone whom I hadn’t met before (well I had, but she didn’t remember me). When I arrived, she walked right past me. I introduced myself, and she was shocked. “I was expecting you to be short and Irish,” she said. Now far be it from me to suggest that this says anything negative about this person; she’s really very nice and quite professional. It is simply that her assumption, based partially on my name and partially on my job title, was that I would be a white guy. So much so that it didn’t even occur to her that the black guy waiting outside her office was her 11 o’clock.

So we use “person of colour” as a way of describing a sociocultural phenomenon of existing in contrast with a dominant political majority group, without implicitly elevating that group. It’s a deliberate rebranding that helps to erode one of the subtler nuances of endemic racial bias, rather than simply being an arch-PC term to avoid hurting feelings.

However – and this is really what this post is about – there are many more racial dichotomies that we as PoC deal with every day that have nothing to do with white people. In this particular case, treating PoC as a homogeneous group does us a significant disservice: it counterproductively elevates non-PoC while neglecting the fact that the “group” is not really a group at all.

So why don’t I spend more time talking about these other conflicts along racial lines? If I recognize that the black/white dichotomy is an oversimplification of race and race issues, why not do my part as an anti-racist commenter (if I may be so bold as to describe myself that way) and focus on these other issues? Part of my reluctance to wade into those other conflicts is that I don’t have any connection to them. I grew up observing the black/white dialogue, since it was relevant to my life personally. Simply being a PoC doesn’t grant me some kind of magical insight into cultures that are not my own, except insofar as I recognize those elements that are common to my own history.

I regret that I wasn’t able to make this issue more explicit during my talk, because I may have seemed to grant license to treat the black/white issue as emblematic of the totality of the race discussion, or worse – I may have suggested that only black/white racism is worth discussing. I certainly did not intend to convey that, and I’m hopeful that anyone who was at the presentation or who watched it online didn’t carry that impression away.

Like this article? Follow me on Twitter!

“I believe that…”: when to ignore someone (pt. 2)

A couple of weeks ago, I talked about a couple of catch-phrases that immediately raise flags in my mind and allow me to ignore the rest of the argument. A line of reasoning that is based on any logical fallacy reminds me of one of my favourite Bible passages (yes, I have favourite Bible passages):

(Matthew 7:26-27) And every one that heareth these sayings of mine, and doeth them not, shall be likened unto a foolish man, which built his house upon the sand: And the rain descended, and the floods came, and the winds blew, and beat upon that house; and it fell: and great was the fall of it.

Of course, in the above passage, Matthew is talking about anyone who doesn’t follow the teachings of Jesus, but the parable is still useful in describing what happens to arguments that are built upon faulty premises. I recall a conversation with my father about the value of theology. His position was that it was a valid field of inquiry, based on logic and reasoning. I told him that when it is based on an assumption that is illogical and lacks evidence – assuming the truth of that which it wishes to prove – it is a masturbatory exercise only.

Such is any argument that starts with the phrase “I believe…”

I find this strategy pops up again and again when talking about religion, but also when talking about pseudoscience, alt-med nuttery, and basically any time you find someone on the left in a debate about anything. When put into a corner, the wheedling cry comes up as the preface to a long series of assertions. Of course, you can’t attack those assertions, because it’s what that person believes. They don’t need proof!

I am reminded of a “debate” I saw between the skeptical atheist who goes by the online alias Thunderf00t and Creationist Bobo-doll Ray Comfort (for those of you who don’t know, this is a Bobo doll). Comfort is a master of typical creationist tactics. First, he unleashes a barrage of terrible arguments that have been refuted a thousand times before (the refutations of which he’s also heard a thousand times before). When the patient skeptic opposite him tries to take one of them on, Comfort backpedals into arguments from incredulity (based on an intentional misunderstanding of science, particularly biology – “do you really think that humans could evolve from frogs?”), which eventually turns into a reductio ad mysteria, where he asks for the answer to a question that nobody has solved:

When the skeptic opponent answers honestly that we, as a species, have not yet discovered the answer to how life started, or what existed before the Big Bang, Comfort then asserts smugly “well I know the answer.” The answer, by the way, is always Jesus.

The problem with a statement like that, aside from its complete and utter vacuousness, is that it’s false. Ray Comfort doesn’t know how the universe began. He has a belief that is based on a particular interpretation of a particular version of history from a particular tribe in a particular region of the world. To know something means to have evidence of that thing’s truth. Ray Comfort doesn’t have any evidence of anything, just his half-baked belief system (I say half-baked because he clearly doesn’t even understand the scriptures he quotes from).

I recall another conversation with my father (he comes up a lot in topics like these, as he has studied theology) wherein I was trying to explain to him that simply believing something does not grant it some kind of legitimacy, and that it was necessary to test beliefs with the scientific method. People are capable of believing a great many things, many of which are untrue. His response was that science isn’t the only way to know something.

I was too stunned to respond. What I should have said is that while science might not be the only way to know something, it is definitely the only way to find out whether what you know is true. Theology (the subject we were debating) is built upon the premise that a deity exists, and then uses (and misuses) the rules of formal logic to work out “proofs” of its position. The problem with this kind of internally-valid “reasoning” is that there is no basis for establishing whether the premises are true. For example:

1. X exists
2. If X exists, it has properties of A, B, …, Z
3. Therefore, X has properties of A, B, …, Z

The problem with this argument is that we have no reason to trust the truth of Statement 1. Statement 2 might be entirely reasonable. It may necessarily follow that if God exists then He has the properties of omniscience, omnibenevolence, and omnipotence (it emphatically doesn’t follow, and those properties are arguably mutually incompatible, but let’s just pretend), but that is not proof that such a being exists in the first place. It is not sufficient to assume the existence of that which you are trying to prove, however convenient that may be. You have to find a way to demonstrate it through observation – this is the scientific method.
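For the logic nerds, here is the same skeleton rendered in the Lean proof assistant – my own toy illustration, with `XExists` and `HasProps` as placeholder names rather than anything from the original argument. The point it makes is that a proof checker will happily certify the inference while remaining completely silent on whether the premise fed into it is true:

```lean
-- Placeholder propositions standing in for the argument above.
variable (XExists HasProps : Prop)

-- Statement 1 (h1) and Statement 2 (h2) are simply assumed as hypotheses.
-- The conclusion follows by modus ponens: the *form* is impeccable,
-- but the conclusion is only ever as trustworthy as h1 itself.
example (h1 : XExists) (h2 : XExists → HasProps) : HasProps :=
  h2 h1
```

The machine vouches for the step from premises to conclusion; establishing h1 is a job for observation and evidence, which is the entire point.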

Getting back to the original topic of this discussion, when someone says “I believe that Y is true”, or in Ray Comfort’s case when he simply asserts that he “knows” that Y is true, based on the assumption of the truth of X, they haven’t given the listener any useful information. All they’ve done is state a personal prejudice. Without the ability to point at some body of evidence and say “I draw my conclusion of Y from this collection of facts”, it’s about as useful as saying “Neapolitan ice cream is better than pistachio.” My usual response to such statements is to say “that’s nice that you believe that. So what?”

Needless to say, I don’t have a lot of second dates 😛

Accommodation vs. Confrontation

I suppose I have been remiss in formally commenting on one of the major debates currently going on in the atheist/agnostic/secular movement. There is a camp of people that thinks that the pathway to achieving the goals of a secular world is to work hand-in-hand with religious groups, and avoid offending the sensibilities of the religious at all costs. This camp believes that the path to peace can only be achieved if atheists are perceived not as a threat, but as welcome allies in the struggle to achieve a more stable, democratic society.

The other camp wants the first camp to STFU and GTFO.

This debate has been colloquially referred to as “accommodationism” vs. “confrontationalism”. Accommodationists want to work with religious people and find ways to ally the goals of the atheist movement to those of the religious movement, being very respectful at all times of the beliefs of others. Confrontationalists think that the path to achieving the goals of the movement is to assertively articulate our position and push on both the legal system and the large unengaged middle to highlight the important issues and bring about large-scale change.

There is currently a dispute, some might call it a fight, over which of these approaches is the correct one. I have not yet, at least in print, expressed which camp I ally more closely with.

Before the “big reveal”, I want to talk about a similar situation that was happening during the Civil Rights movement in the United States in the 1950s and 1960s. If I may be so grotesque a mangler of history, we can contrast the approaches of Martin Luther King Jr. and Malcolm X as the “accommodationist” and “confrontationalist” camps (respectively).

I will say at this point that Malcolm X began his political career as the mouthpiece of a fundamentalist Muslim who advocated mass conversion of black people, and complete segregation of those people from white America – essentially establishing a self-contained religious theocratic state within the USA. Martin Luther King Jr. was no saint either – he was happy to use segments of Christianity as justification for his struggle, without acknowledging the fact that it was that same philosophy that was used to justify the enslavement and systematic oppression of the very people he was fighting for. The two men were not really fighting toward the same goal, except insofar as they were both interested in increasing the autonomy and independence of black Americans. However, for the sake of convenience and familiarity, I hope you will allow my somewhat ahistorical comparison.

It was directly due to the influence of MLK that the Civil Rights Act of 1964 was signed, granting equal protection and access under law to people regardless of their ethnicity. His doctrine of non-violent resistance and co-operation with white leaders and people, coupled with his amazing powers of public persuasion and charisma, reached out to all corners of society, even those who might not otherwise agree with the aims of the movement.

As a contrast, Malcolm X was far more militant (in the literal sense, not this ridiculous pap of “militant atheism” that basically just means speaking your mind directly and unashamedly) and confrontational than his counterpart. He famously disdained the inclusion of white people in the black nationalist movement, referring to them (using the language of the Nation of Islam under Elijah Muhammad) as “white devils”. He galvanized his audience – disillusioned and disheartened black youth – by presenting them with a vision of black people as a group under oppression, rather than as a lesser race. He advocated disciplined uprising against the current socioracial system, albeit under theocratic direction.

What those who favour staunch accommodationism are suggesting is that the contribution of Malcolm X, namely the doctrine of black power (which I will take a moment to say should not be contrasted with “white power”, an entirely different concept), was not valuable and/or necessary to the civil rights movement – a claim which is far more ahistorical than my own admittedly crude analysis. The Nation of Islam and its confrontational doctrine accomplished two simultaneous goals. First, it unified and attracted black youth to a cause that many viewed as just more political posturing that would not improve the day-to-day reality of being black in America. Second, it terrified the white establishment out of its complacency and forced it to seek alliances in the black community that would demonstrate its sympathy to the cause.

Failing to recognize the influence that black nationalism – which experienced several resurgences (most notably in the 1970s under the Black Panthers, the 1980s in the burgeoning hip-hop movement, and currently with the rise of anti-racism and afrocentric black intellectualism) – played in the establishment of civil rights paints a picture of history that dooms us to repeat it. The same thing is happening currently within the atheist movement. Phil Plait, alongside Chris Mooney, Sam Harris, and other prominent atheists, seems to take the position that accommodationism is the path toward mainstream acceptance, whereas confrontation is unwelcome and pushes the atheist movement backward.

I have used a metaphor that is unfortunate in its level of violence, but apt in its ultimate meaning. Imagine a battlefield between two opposing forces, one force with both infantry and archers, arrayed against one that is purely infantry. As the two footsoldier contingents meet in the middle, the unbalanced force is cut to ribbons by the arrows of the archers, resulting in a trouncing. Similarly, a force that is purely archers would be overrun by brutes wielding swords. However, two evenly matched opposing forces must rely on tactics and real strength to prevail. The hole in this analogy is, of course, that people die in war. Nobody is seriously proposing that atheists be killed, nor would any self-respecting secularist call for the violent removal of the faithful.

The point of this analogy is that different people are persuaded by different things, and to use only one tactic (either accommodation or confrontation) will result in the rapid trouncing of the atheist/secularist movement by the religious, who use a variety of methods to advance their points. However, when the opposing forces are balanced in their armaments, the battle is decided by that which remains – the evidence. In that case we win, because by definition the evidence is on the side of the skeptics.

We need the Malcolm X school to bring apathetic atheists out of the closet by pointing out the evils and influence of the religious establishment, and to put the fear of the godless in the believers. To balance that, we need the Martin Luther Kings of the movement to be reaching across the aisle to find mutual ground with the more moderate and freethinking elements within the theist camp. Saying that one group is counterproductive is short-sighted and foolish – buying into the fear and discomfiture of the oppressors to justify throwing your compatriots under the bus.

As a caveat to this diatribe (which has gone far beyond the TL/DR barrier, for which I apologize), it is important to note that even the two paragons of accommodation and confrontation recognized the need for balance. MLK often expressed his contempt for the philosophy of “gradualism” – the idea that human rights should be doled out slowly over time, to protect the oh-so-sensitive feelings of racist whites. After his Hajj, and after leaving the tutelage of Elijah Muhammad, Malcolm (by this time known as Malik El-Shabazz) began to reach out to non-black people who expressed a desire to advance the cause of black nationalism. Tragically, of course, the fundamentalists on either side of the debate weren’t having that, and both men were assassinated.

The fact is that in the struggle for civil rights, there must be both a carrot and a stick; a voice that pulls dissenting groups together, and one that drives the points forward without fear. I was not alive at the time, but I can’t imagine that MLK didn’t have at least one conversation (and likely hundreds) with concerned white people saying “that Malcolm X is driving the civil rights movement backward by alienating people!” We know from transcripts of his speeches that Malcolm had a great deal of contempt for those he viewed as selling out the Negro birthright to capitulate to the white man. The forces worked in opposition, but toward the same ultimate goal. How much more powerful would the atheist/secularist movement be if we stopped this petty (and meaningless) squabbling among our own ranks and instead marshalled our respective forces toward the ultimate goal of a society in which we are free to have our own opinions, regardless of dogmatic interference of any kind?

TL/DR: Much like Malcolm X’s confrontational style was a necessary balance to Martin Luther King’s accommodationist style, the respective philosophies within the atheist/secularist movement are both required for long-term progress towards civil rights. Failing to recognize this is a weakness within the movement.

Like this article? Follow me on Twitter!

“I’ve done my own research” and “Common Sense” – when to ignore someone

In my random flittings about the internet, I come across many discussion forums. The great downside of giving everyone the tool to voice their opinion is that we’ve allowed every tool to voice their opinion. Without wanting to sound like too much of a snob, there is a meaningful connection between formal education and the value of your contribution to a discussion. To forestall the predictable rejoinder (I would make it myself at this point), I am not saying that only people with PhDs are worthwhile; nor am I saying that someone with a PhD is necessarily worth listening to. What I am saying is that during the process of formal education, particularly in philosophy and law, one learns the rhetorical tools required to construct a coherent and logical argument (if you have a degree in philosophy or law and don’t know what I’m talking about, go the hell back to your school and demand a refund).

As a side-effect, it becomes easier to recognize those arguments that are spurious and based on emotive “reasoning” rather than evidence or logic-based induction/deduction (again, if you don’t know the difference, go take a philosophy course, or get some tutoring). In a post that now seems ancient, I described some of the tools commonly used by the forces of stupid as substitutes for logic. When you’re unfamiliar with common logical fallacies, you’re more likely to be persuaded by them – it’s like not knowing which berries in the forest are poisonous.

However, there are two that I’ve seen cropping up that start my eyes a’rolling.

1. “I’ve done my own research on this, and…”

I don’t know who finds this argument persuasive, but it immediately turns me off ever listening to that person. The internet has given us many wonderful things, but many of those things have a dark side. For example, we have unprecedented access to information – anyone with an internet connection has immediate access to the collected knowledge of the human species in ways that were barely even imaginable when I was a kid. I remember having a World Book encyclopedia set in my elementary school library. Someone had stolen, or lost, or destroyed, the S section. As a result, I didn’t know what a salamander was until I turned 21 (note: this story is almost entirely fabricated). The point is that we are no longer reliant on schools to give us knowledge or facts – it’s all available at our fingertips.

The downside of that is, of course, that not all facts are created equal. Cruise any creationist or white supremacist or climate change “skeptic” web forum and you’ll find lots of things that people call facts. The challenge is in discerning between things that are factual, things that are plausible, and things that are simply nonsense or fabrication. This is the realm of critical thinking, a skill which I find is in all-too-short supply.

So when someone tells me that they’ve “done their own research”, that is not persuasive to me at all. Actual research requires training in certain methodologies, which most people don’t have. Further, you have to be trained in the right methodology. Being trained in the scientific method, for instance, gives me some confidence that I can read and critically analyze a scientific study. None of that makes me qualified to critique someone’s interpretation of history – I’m not a historian. My opinion on matters of history, or philosophy, or even science, based on my own “research” is likely to be incredibly faulty and limited by both my training and my years. This is why the scientific consensus is such a powerful thing, and why anyone who wants to challenge it should come in with buckets of evidence, not simply vague accusations of conspiracy and lots of capital letters.

There’s also a metric assload of biases, heuristics, prejudices and all other manner of cognitive problems with someone “doing their own research.” Oftentimes people will have an idea fixed in their head, and go looking for evidence to support it. I know I’ve caught myself doing this before. This isn’t ‘research’, this is confirming your own biases. True research sets up systematic mechanisms to control for and try to eliminate these biases, and it takes time and training to learn how to do this properly.

I’m fine with someone saying “I’ve done my own research…” as long as they’re able to point to it and show me. There’s no excuse besides laziness for demanding that someone believe your opinion if you can’t show your work. Any of the opinions I put up here are subject to the same scrutiny, and if chased down, I’ll either go to my source material or admit that I’m just making stuff up that seems logical. What I won’t do is say “well I’ve looked into this, and these are the facts, and you have to believe me because I say so.” Anyone who does that should be ignored right out of the starting gate.

2. “It’s just common sense that…” or “Common sense dictates that…”

Of all of the stupid arguments I come across, this one has got to be the worst. “Common sense” is the most inaccurately-named concept out there – it’s not common at all, and it’s rarely sensible. Appeals to common sense assume that there is some universal filter through which all human beings see the world, one that is ‘common’ to us all. The reality is that depending on your upbringing, your education, your experiences, and your specific training in fields like logic and rhetoric, you build for yourself a pretty thick filter through which you receive information. This is done partially to take some of the workload off of your brain – if you can classify things quickly and easily, it frees up resources to do other things (ever been exhausted at the end of a lecture on a topic with which you weren’t familiar?)

Our filters exert a great deal of influence over our thinking. That’s why it’s “common sense” to me that scientific studies are better than a list of patient testimonials – I’ve seen lots of examples, in my own life and in other circumstances, in which people mistake the placebo effect for the efficacy of whatever quack treatment they receive. However, it seems that to many chiropractors, or homeopaths, or reflexologists, and yes even licensed physicians, patient testimonial trumps science. It’s just common sense, right?

Appeals to “common sense” simply say to me “I haven’t bothered to spend any time or effort to think about this, or to look to see if there is any evidence of it, but I believe it anyway, so I’m going to assume you make the same assumptions about the world that I do.” I lived in Ontario during the reign of Premier Mike Harris, who gutted education spending, closed hospitals, fired nurses, and basically ruined the shit out of social services. It took years for the province to recover, and some services still haven’t recovered to this day. He called his policy “the Common Sense Revolution”, which is why I get chills every time anyone tells me that they wish people would “just use common sense.” I want fewer people to use common sense, and more to use some friggin’ evidence please.

If you don’t have evidence, but you think your position is reasonable, it’s fine to say so. But again, you have to show your work. If you can (like I try to do with all of these Monday thought pieces) walk your audience through your logic, then you’re not using ‘common sense’ any more, you’re using reason. There’s nothing wrong (and a lot right) with doing this. It is a lot more difficult and time-consuming, but you’re more likely to a) convince those who disagree with you, and b) find errors in your own thinking if you do things this way.

So if you’re going to try and convince me that you’ve got answers based on either your “own research” or your “common sense”, try not to be offended or surprised when I laugh, and put on some headphones until you stop making noise out of that hole in your face.

TL/DR: Real research takes training and understanding, and “common sense” involves neither. There are ways to present an argument persuasively, but invoking either of those things does not impress me.

Like this article? Follow me on Twitter!

What it means to ‘replace’ science

Not too long ago, I had a conversation with a friend of mine about the dichotomy between science and religion. His position was that we can’t rely on certainty in anything, since our understanding of the universe is constantly changing. Because of this, he reasoned, faith in the supernatural is just as valid as the use of scientific evidence. I had a similar conversation with another friend a few months later, who was trying to convince me that medical woo-woo might be validated someday because the nature of science was “constantly changing”.

This position is, at best, trivially true – and only if you consider all forms of change to be exactly the same. Even though I walk 5 km towards work every morning, I will never end up 10 km away from work. Even though my position is “constantly changing”, I’m not jumping all over the place at random, hoping eventually to land at my office. Our understanding of the universe and the processes that hold it together similarly does not fluctuate at random – it is modified by progressively better evidence. So while the statement “science is constantly changing” is true, it is true only in one specific way.
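To put the same idea in a toy sketch (my own, not my friend’s): compare change that is corrected by evidence with change that jumps around at random.

```python
import random

def evidence_driven(position: float, target: float, steps: int) -> float:
    """Directed change: every revision closes part of the gap between
    our current model (position) and the way the world is (target)."""
    for _ in range(steps):
        position += 0.1 * (target - position)  # correct a fraction of the error
    return position

def anything_goes(position: float, steps: int) -> float:
    """Change as the 'science might say anything tomorrow' crowd imagines it:
    each revision is a jump in an arbitrary direction."""
    for _ in range(steps):
        position += random.uniform(-1.0, 1.0)
    return position

print(evidence_driven(0.0, 10.0, 50))  # homes in on 10
print(anything_goes(0.0, 50))          # could end up anywhere
```

The first walk always converges on its target; the second is a drunkard’s stagger. Scientific change is the first kind.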

My first friend brought up our understanding of physics as an example of how things might be completely different in 25 years (this was after many drinks, so I’m going to go easy on him). His position was that while we “know” that F=ma today, we might have an entirely different understanding of the relationship between force, mass, and acceleration tomorrow. He cited the re-orientation of our picture of the world once quantum physics was better understood as an example of how science can be “replaced” by newer understandings.

“Bullshit,” I replied. “Einstein didn’t ‘replace’ Newton; he showed where the limitations of Newton’s mathematics lay, and provided a guide for how to overcome them.” In order for Einstein to ‘replace’ Newton, he would have to provide sufficient evidence of events or occurrences where F did not equal ma – in other words, there would have to be overwhelming evidence to show that F only coincidentally equals ma. What Einstein did was show that Newton’s laws hold within a specific range of phenomena. Einstein’s equations had to continue to describe everything that Newton’s did within that range; the fact that they do so seamlessly is a testament to Einstein’s genius.
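The standard textbook illustration of this (my addition, not part of the original conversation) is relativistic momentum, which contains Newton’s formula as its low-velocity limit:

$$p = \frac{mv}{\sqrt{1 - v^2/c^2}} \approx mv \quad \text{when } v \ll c$$

At everyday speeds the correction factor is indistinguishable from 1, which is exactly why Newton’s mechanics worked so well for two centuries – and exactly the sense in which Einstein extended Newton rather than replacing him.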

Perhaps a better illustration of this is the pair of competing theories of evolution in vogue 160 years ago – those of Darwin and Lamarck. Darwin’s theory is familiar to us all – environmental pressures favour certain individuals in a population, who survive and breed. Lamarck’s theory was that environments imprinted changes on individuals, who then passed those traits on to their offspring – for instance, giraffes have long necks due to stretching to reach tall leaves. While it sounds ridiculous now, it certainly fit the available evidence (DNA and modern genetics were not yet understood, and the heritability of traits was well-documented). Presented with two competing theories, biologists of the day looked to see which one matched the evidence best (Darwin, of course, had the advantage of basing his theory on years of carefully-collected evidence).

Since then, many developments have been made in biology. The discovery of the structure of DNA, for example, led to a greater understanding of where variation in species comes from, and how mutations occur. Advances in technology have enabled us to measure climate changes and global events that happened millions of years in the past. The tree of life has been re-drawn (one of the few examples of a time when science has been completely re-understood, but the old tree of life wasn’t based on rigorous science, simply some guy looking at things and giving them names) to reflect new understandings of the common ancestry of all life. Changes have been made to Darwin’s original theory in light of evidence that wasn’t available to him at the time. None of this means that evolution has been replaced, any more than the 26 year-old version of me is going to “replace” the 25 year-old version of me on my birthday (which is coming up soon – please give me many presents). It is a development that refines and builds upon the understandings of the past.

Hence my objection to the idea that science is “constantly changing”, and therefore only selectively valid. This attitude comes from a fundamental misunderstanding of what “science” is – one that I have talked about before. Science is not merely a list of facts in a dusty book on a shelf – it is a process that involves taking a bird’s-eye view of a group of facts and organizing them into a central concept that can be tested for validity. Any change in scientific understanding must, at the very least, continue to explain those things which have already been observed to be true. It has to explain all of the things that have been observed to be true, not simply cherry-pick the facts that agree and neglect all of the contradictory evidence.

This is why I am confident making statements like “God isn’t real” or “homeopathy doesn’t work” or “vaccines don’t cause autism.” Woo-woo supporters are quick to pipe up “you can’t know that for sure”, demanding the impossible proof of the negative. Claims about an intervening supernatural being, or the (selective) memory of water, or the supposed link between vaccination and developmental disability would require a completely new understanding of physics, physiology, biology, and a handful of other ‘-ologies’ that are based on a wealth of evidence. “Science is changing all the time,” they whine, “so we just may not know how it works yet.” Once again, I say unto them “bullshit.” Not only is there insufficient evidence that reiki, or intercessory prayer, or a link between cell phones and brain cancer is in any way factual; for any of them to be even plausible, we’d have to invalidate everything we have learned about reality so far.

So while developments can, have been, and will continue to be made in scientific fields, they work in a linear fashion as long as we continue to follow the evidence. It is because of this that I am satisfied to put my trust in this method, rather than one based on faith or magic.

TL/DR: New discoveries don’t “replace” older ones; they add to an always-growing body of evidence that helps us to understand the world. Woo-woo theories require us to throw out the evidence, or at least pretend it isn’t there.


Why do I want to take religion away?

People who argue against the influence of religion, and argue for its separation from public life (which Jesus also did, by the way, for those of you who actually bothered to read scripture), are commonly asked the same question: why do you want to get rid of religion? Atrocities have occurred throughout history, committed by people who tried to outlaw religion (Pol Pot, Stalin, Mao) – why would you want to go down that path? Surely outlawing religious practice would lead directly to the same atrocities!

Unlike other arguments that I present and then ridicule, this argument actually has some merit. History has indeed shown us what happens when you try to force a belief system upon a group of people, whether it be state-sponsored atheism or state-sponsored religion. Horrific deeds are the result when you try to control someone’s mind. The problem with the argument is that it makes an erroneous assumption: that I (or those with similar viewpoints) want to get rid of religion.

I will state here unequivocally that I have no interest in taking religion away from people, even if such a thing were possible. Religion, like racism (and herpes), will be around in some form or another regardless of legislation or acts of physical force, and will just keep cropping up here and there. However, even if I could somehow mandate the removal of religion, I would not. I have no right to make decisions on someone else’s behalf – respect for individual autonomy is a fundamental tenet of ethics.

So why write all of this stuff then?

There is a common misconception that people who argue against the influence that religion has in public life are somehow trying to take away people’s ability to believe what they want. This is the same line of reasoning used by people who accuse affirmative action advocates of taking away jobs from white people. It comes from a mindset (which I’m sorry to say seems to be held pretty much exclusively by conservatives) that the way the world is now is the way it is supposed to be. White people are at the top of the heap worldwide? Ah, well that must be their manifest destiny! Christians dominate the political spectrum? It must be God’s will.

This is inherently built into the concept of ‘conservatism’ as opposed to ‘progressivism’. Conservatism, by definition, is about holding on to and maintaining traditional structures and practices. In and of itself, this isn’t a bad thing. Some traditions are important to maintain in order to understand where we came from – go to a military parade exercise and look at the seemingly-archaic procedures of marching and saluting. However, when we take a nuanced view of traditions, we understand that some of them need to be updated to reflect present-day reality. Conservatives deny this, instead fighting to maintain the status quo.

Some people who identify themselves as ‘conservative’ will say that the conservative movement is about maintaining individual autonomy, and refusing to capitulate to societal pressure or government shows of force. This philosophy is correctly called Libertarianism, and for reasons that I can’t quite fathom it has been rolled into the conservative platform. Libertarianism stands opposed to collectivism (or authoritarianism – a rose by any other name…), and should not be confused with conservatism. In the same way, many people who identify themselves as ‘liberal’ (myself included) do not see themselves or their values reflected in the communal-authoritarian or arch-relativistic philosophy of progressivism. While their/our beliefs may often overlap with those in the liberal movement (gay rights, public education, health care), there are things to which they/we voice strong objection (health “freedom” woo, the role of business, religious “tolerance”).

What does this have to do with anything?

Humankind, like anything else, must constantly adapt to reality as things change. This philosophy is perhaps best encapsulated in the Taoist tradition, in which one is exhorted to be mindful of the flow of the universe (the Tao), and instead of resisting its direction, to allow one’s self to move in harmony with it. This adaptation and change is necessary for survival – as we know from evolutionary biology, those species that cannot adapt, die. If we want to survive as a species, or as a society, or as individuals, we must learn to respond to environmental/social/political challenges and find a way to live with them.

This need for change stands diametrically opposed to the religious/conservative philosophy (small wonder that those who oppose the teaching of evolution are almost exclusively conservative religious people), in which the status quo must be preserved. If the world works this way for a reason, then any attempt to adapt the way we do things is a betrayal of the order of the universe. Change is bad, and so are those who advocate it.

Religion is an impediment to human progress. It is the yoke around our necks that slows us and prevents us from being able to adapt and explore and challenge new frontiers. While sometimes progress needs to be examined closely through the eyes of caution (life-extending technology is perhaps one example), that is not the same as standing as a roadblock to progress at every opportunity. Sometimes (in fact, often), rapid response is needed to relieve or prevent human suffering, and when we have to wrangle at every step with those who refuse to accept rationality or observed reality as truth, suffering is prolonged. The problem with simply throwing up our hands and agreeing to disagree is that one of these philosophies is trying to kill us.

So should we abolish religion?

I don’t think it is generally advisable to abolish religion. It’s definitely not a good idea to outlaw certain types of belief. That is merely substituting one form of tyranny for another. This seems to be the fear of religious people in the face of secularism – that somehow they will be persecuted and forced to recant, or prevented from practicing their beliefs.

Nobody is advocating this position – not seriously, anyway.

But there needs to be an admission on the part of the religious community that curtailing the outrageous level of privilege that religious belief has enjoyed over the past few thousand years is not the same as oppressing religious people. Currently, being a “person of faith” is somehow seen as a virtue, and piety is confused for righteousness. Religion has become a qualification for public office (thankfully not so much here in Canada, but that may be changing), and school boards everywhere are becoming embroiled in fights that are ideological, rather than fact-based.

When we no longer accept religious beliefs as valid arguments, and instead rely on evidence and logic, we are better suited to adapting to changing reality. The founding fathers of the United States understood this, which is why they expressly forbade religious tests for public office and the establishment of a state religion. Sadly, this has been slowly and steadily eroded to give us a system wherein Sarah Palin is taken seriously when she says we have a Judeo-Christian heritage in this society. However, the principle still stands. If we are able to move back toward such principles, in which superstition is not granted equal time with fact, we will be in a much better position to address the challenges that we face today as a species, and the ones we will undoubtedly face anew tomorrow.

TL/DR: While I do not think it is a good idea to outlaw religion, I would like to see us move toward a system that does not grant it the special privileges it currently enjoys. Also, a bunch of stuff about how conservatives are trying to kill us.

Like this article? Follow me on Twitter!

Oh good, Canada still uses slave labour

I don’t even know what to say about this one:

The B.C. government has terminated a contract with a Surrey forestry company after 25 workers – many of them immigrants from the Congo – were found living in substandard conditions near Golden in late July.

That’s not even the bad part; this is:

Most of the 25 workers had travelled from eastern Canada for jobs clearing brush near Golden. They were living in a bush camp and complained of a lack of food and inadequate facilities, a church worker in Golden told The Vancouver Sun. And the workers told government officials they were not fully paid and on the job seven days a week.

Slavery makes good economic sense. It’s even practicable – get people who have few options, take them away from any resource they’d have to achieve alternate employment, then bully and threaten them into accepting low wages (or no wages). When they have no other options, they’ll take whatever they can get. It’s the ultimate victory of free-market capitalism: get as much as you can for as little expenditure as possible.

But then of course, there’s the whole thing about being evil. Inconvenient, eh?

I try to make these posts have a bit more relevance than simply linking you to news items I find in the paper. There’s an underlying theme here that I think is interesting, but most of you probably won’t like. There’s a hip-hop artist called Ras Kass who released an amazing album back in 1996, featuring a song entitled Nature of the Threat.

Warning: language and content advisory

Nature of the Threat is an interpretation of history whose thesis is essentially that white people are inherently evil – highlighting the atrocities perpetrated by whites throughout history. It’s quite a task to separate the fact from the fiction in the song, but there are a number of points that deserve exploration and discussion (the Euro-centric teaching of history, the legacy of systemic racial discrimination at the hands of Europeans). I like the song, even though I disagree with many of its components, and doubt the validity of its thesis. The above story makes me think that slavery has nothing to do with the colour of people’s skin, merely a desire for power and the opportunity to exploit others. It is an unfortunate coincidence that many of the workers are black Africans; the business owners, notably, are not white:

Khaira owner Khalid Bajwa said he has been treated unfairly by the ministry, who didn’t give him an opportunity to correct any camp deficiencies. “I don’t know why they are complaining. We never had problem with our camps. It is a bush camp. It is not a tourist camp,” he said. “We were setting up the camp. We had just moved there.”

Of course Mr. Bajwa’s story paints only part of the picture:

Quesnel native Christine Barker, 24, had worked in the woods for other companies for five years without incident. The single mother said Tuesday she has never dealt with abuse like what she experienced at Khaira…

“When we started the work refusal, that’s when the camp conditions got even worse – showers were denied. … We were refused food because we weren’t working for him at that time.” She said she witnessed a supervisor threaten to kill one of her Congolese co-workers and throw a knife at him.

Sounds like slavery to me.

The point is that while we can blame white Europeans for a lot of the problems in the world, we can’t do so based on the colour of their skin. There’s nothing genetically cruel or inhumane about white people, just as there is nothing genetically lazy or stupid about Africans. People are people, and given the right set of circumstances and motivations, they will commit the same atrocities, or acts of kindness, or feats of inspired genius. The situation we have now is merely a product of how things shook out in the world. We cannot rely on the inherent goodness or evilness of people; we must realize that the situation determines our behaviour more than we suspect, ensure that all people have equal access to protections under the law, and then work to ameliorate those situations that lead to destructive or oppressive behaviour.

I feel motivated at this point to make an unequivocal statement that I don’t have any particular animosity toward white people. As a sometime student of history, I recognize that the story of our world has been filtered through a European lens, and that my white friends and family members are victims of the same system that I’ve been speaking out against. Those of you who know me personally will be able to attest to this. For those of you who don’t, you may read through the rest of my writings (particularly last Monday’s post) if you doubt my sincerity. If I have caused offense, please accept my apology (and tell me so in the comments).

Like this article? Follow me on Twitter!

Being creative without a Creator

A friend sent me a link to a 20-minute talk on creativity by Elizabeth Gilbert, author of Eat, Pray, Love. I’m not a big fan of the book (I got through about 25 eye-rolling pages before giving up and reaching for the remote), but I am a big fan of (my friend) Claire, so I gave it a chance. I was right with her up until 8:30, when she started in on “creative mystery” and an external, supernatural source for creativity, and then the rest was invocations of magic and self-indulgent privileged pap, the likes of which Jim Carrey would fervently subscribe to.

I do not know if Claire’s intent was to murder my neurons; I doubt that she was trying to lobotomize me through the intarwebz. She did ask me to write about some of my thoughts on the creative process from the perspective of an atheist. I suppose I have some claim to qualifications in this regard, given that I do spend the non-science half of my life playing and creating music. I’d like to share some of my thoughts on this subject, but first I want to address some of the themes that came up in Ms. Gilbert’s talk.

Is suffering necessary for creativity?

A commenter on my strangely-popular “I am not my ideas” post from a few months ago brought this up. Some of the greatest artists of all time (think Van Gogh, Beethoven, Vonnegut – the list goes on) have suffered, and from their suffering came their genius. The image of the tormented artist is so common as to have become almost completely cliché. Douglas Adams satirized this phenomenon in his Hitchhiker’s Guide series, in which time travel inadvertently robs the galaxy of one of its greatest works of art by making the artist happy. Of course, we have to remember that Douglas Adams was a creative genius, and was not particularly unhappy. Nor, by all accounts, were Bach, Shakespeare, da Vinci, or John Lennon – that list goes on as well. While suffering can yield insight that can bring creativity forth (and in my experience it is much easier to write albums when you’re sad than when everything’s awesome – just ask Matthew Sweet), it is not necessary to suffer in order to bring forth great works.

Is the supernatural the source of creativity?

Ms. Gilbert spends some time talking about daemons or geniuses – supernatural embodiments of inspiration that serve as conduits between the artist and the divine. As with all supernatural agents of causality, there’s no evidence that such faeries exist (which, to her credit, Gilbert admits). Being a musician, I can testify that inspiration does seem to come from nowhere. I’m sure that other artists and musicians have a much more palpable experience of inspiration than I do (things kind of just pop into my head, rather than my being overcome by a ghost that demands I pick up a pencil). However, given the diversity of ways in which inspiration strikes people, the fact that it hits some people more often than others, and the fact that to all appearances it strikes at random, it’s safe to say that inspiration is unlikely to be caused by a supernatural force for which there is no evidence.

Subjective experience vs. objective reality

Our brains make a fundamental error when it comes to subjective and psychosomatic experiences. Because we interpret the outside world through our senses, we confuse sensory experiences with reality. So when, after meditating for an hour, we feel connected to an external loving presence, that feeling does not constitute evidence that the presence exists in reality. Don’t get me wrong – there is a lot of value in subjective experience. Feeling connected to the world, or to nature, or to your fellow human beings can bring you a sense of happiness and motivate you to be a better person. However, the leap from feeling something to asserting that it exists requires non-subjective evidence. To wit: just because artists feel an external force driving them to create doesn’t mean that there are muses or daemons or disembodied geniuses behind it.

Gilbert would like us to return to the days of magical thinking, in which we attribute inspiration to outside ethereal forces. Reality is all well and good, she seems to say, but we’d feel a lot better if we pretended there were invisible spirits whispering in our ears. If we screw up, well, it’s the fault of the spirits. When we succeed, giving the spirits the credit will prevent us from getting swollen egos. Who cares if it’s all a lie if it makes us feel good? You can probably tell I’m not a big fan of self-deception, even when it’s practical. It might comfort us to lie to ourselves, but the truth is important. It enables us to deal with each other in a way that reflects the world around us, and prevents us from endangering each other through misinterpretations of reality.

So where do I think inspiration comes from?

There’s a common criticism of skeptics and scientific skepticism that we want to strip the majesty and beauty out of life. Apparently, to some people, understanding how something works makes it less beautiful. Of course, having no idea how something works makes you sound like a complete moron, but that may not be the worst thing in the world. That being said, I still reject the idea that familiarity breeds contempt. I’ve known that stars are inconceivably large nuclear reactions happening in space trillions of kilometers away since I was a little kid – none of that makes a starlit night any less beautiful. I’ve known that music is vibrations in air resonating tiny bones within the middle ear and triggering neural activity since I was in elementary school – none of that makes me enjoy Beethoven’s 6th symphony any less. I’ve known that there are evolutionary roots for familial love since I was in university – none of that makes me love my parents any less. Understanding the processes behind the world around us can lead to a deeper and more beautiful appreciation of reality.

We know that the brain is incredibly complex. It adapts to novel stimuli and regulates an incredible number of processes simultaneously, all below the level of what it’s most famous for – conscious thought. It is entirely possible that the way some brains are wired permits a type of lateral thinking that pulls together disparate thought processes to form music. The phenomenon known as synesthesia – wherein sensory input of one type is interpreted as another type (seeing sounds, hearing smells) – certainly lends support to this conjecture. Some brains might just be better-suited to creativity than others, and ‘inspiration’ may ‘strike’ these brains more often. The arrival of such a strike would be experienced in a variety of different ways. This would also explain why creativity is often (but not necessarily) associated with poor mental health – an atypical brain chemistry and structure will have far-reaching effects.

Without intending to, Elizabeth Gilbert has paralleled my idea of separating one’s ideas from one’s sense of self-worth. I have written songs I’m proud of; I’ve written some stinkers that even I don’t like (sadly, far more of the latter than the former). I don’t beat myself up for writing crappy songs, or having crappy performances, in the same way I don’t get a swollen head when something I’ve written makes people cheer. It feels good, but I know that it’s not about me – it’s about the song. I don’t think the song was floating around in the ether, waiting for me to pull it in; that view, if anything, is more arrogant than being glad that my brain popped it into my head. I’m not my ideas in the same way that I’m not my songs – I’m just happy to be able to use my brain to say things in a way that makes people listen.

So while I think Ms. Gilbert reaches the right conclusions – that musicians shouldn’t live and die by their success, and that rejection of the song or book or painting is not a judgment passed on who the artist is as a person – she spuriously invokes magic and daemons to get there. There are better, non-magical, non-woo-woo ways of accomplishing that goal.

TL;DR – Artistic inspiration can be explained through natural processes, without any appeals to woo-woo. The non-magical nature of inspiration doesn’t make it any less wonderful or special.

Like this article? Follow me on Twitter!

The Giving Pledge: Take THAT, free market!

If I had any money, this is what I’d do with it:

The Giving Pledge is an effort to invite the wealthiest individuals and families in America to commit to giving the majority of their wealth to the philanthropic causes and charitable organizations of their choice either during their lifetime or after their death.

Basically, Bill and Melinda Gates are encouraging their rich fat-cat friends to put their vast fortunes to a specific use – financing philanthropic projects. In a gesture that is essentially a big “fuck you” to Ayn Rand, the very people who most typify her heroic characters are choosing not to let the free market decide how best to solve the problems of the world. Instead, they’re pledging their (quite frankly) obscene amounts of money to make measurable and concrete improvements in the lives of those who will benefit from it most.

This is far from simply throwing away money to expiate white guilt or satisfy noblesse oblige, as Eli and Edythe Broad note:

“Before we invest in something, we ask ourselves three questions that guide our decision:

  1. Will this happen without us?  If so, we don’t invest.
  2. Will it make a difference 20 or 30 years from now?
  3. Is the leadership in place to make it happen?

Philanthropy is hard work.  Many people think it’s easy to give money away.  But we are not giving money away.  We want our wealth to make a measurable impact.  And after running two Fortune 500 companies, we’re having more fun now – and working harder – than ever.”

I don’t know how many of you have read Rand’s books. In them, charity is done out of mewling and wheedling obligation to the hordes of lazy poor, hands outstretched. While there’s no doubt that there are lazy poor out there, and while I definitely don’t doubt that much charity is done merely so people feel (and look) less greedy for having money, that’s not an accurate description of reality in general. There are real problems out there that can be solved by real investments from real people. It benefits all of us to have as few barriers to excellence as possible – how many Einstein-level intellects may have died of AIDS or malaria? What would have happened if Stephen Hawking hadn’t had the benefit of modern medical technology? What if Bill Gates had been born in Harlem or on a reserve?

Anyway, this made me happy, so hopefully it makes you happy too. Also, a quick scan of the letters from the pledgers reveals no invocations of Jesus or Judeo-Christian ethics (the Hiltons appear to be Catholic, so maybe I’ve spoken too soon). Seems as though people are happy just to be good human beings, rather than trying to pass through the eye of a needle.

Like this article? Follow me on Twitter!