Sunday, February 28, 2010

"You Do Know That the Flintstones was Only Partially Based on Fact, Don't You?" - Stephen Merchant

There are elected officials who think that this actually happened.

I got the idea for this drawing from reading the Conservapedia page on dinosaurs. For those of you who don’t know, Conservapedia is exactly what you think it is, namely, an online encyclopedia for people who find the entries on Wikipedia to be too liberal.

In future posts I’ll be taking a look at my favorite aspect of this site, the Conservative Bible Project. So, you know ... look forward to that.

For now I wanted to briefly examine this whole issue of dinosaurs.

Obviously, old-earth creationists don’t have any problems with dinosaurs because their view allows for an accurate geological and evolutionary timeline. However, the young-earth creationists have a real problem in that, if their view is true, dinosaurs and man had to coexist. Dinosaurs would have had to be created within the same week as humans, after all.

Now, one would think that something like Tyrannosaurus would bear a mention in the creation story, or perhaps in some historical writing. It probably would have stood out when Adam was naming all the animals.

Original name: Holy $#@!

Certainly, the creationists have a response to this, but I find it hollow. I’ll let you read over the arguments that they put forward, as I think these speak for themselves (just for added fun, look over one of my favorite Chick tracts). You could go to talkorigins if you want to refute certain claims.

Is this proof that man and dinosaurs coexisted? No ... No it is not.

The thing that really interests me about all of this is the lengths to which people will go to make something fit into their belief system. Obviously, dinosaurs were not written about in creation stories because ancient people did not know of their existence. End of story.

But no, if you believe that the biblical story of creation is truth, then you have to find some explanation for any discrepancy. You’ll search for any anomaly or ambiguous term that might fit and create a whole fantasy around it, regardless of any original intention.

As a comic book guy, I’m used to this. We do this kind of rationalization all the time. Hell, Marvel Comics even had a regular fake prize, called a No-Prize, for readers who could explain away continuity errors. I can’t tell you how much time I spent trying to figure out a way to justify Batman’s use of guns in his first appearances, when one of the defining aspects of his character was his hatred of guns.

Reality, of course, was that there was no continuity. Batman is a character that has been drawn and written by scores of creative people over seven decades. You can’t really reconcile every error or change to make a coherent whole. It’s just a fun mental exercise.

Of course, when explaining away why Spider-Man’s costume had a color change, the best that I could hope for was a mention in the letters column. When you do the same rationalization with religion, it is called apologetics and you can earn a degree.

Thursday, February 25, 2010

Just Another Old Drawing


The Tenebrist from Superhero Mundane

Sunday, February 21, 2010

How Christian Were the Founders? - Part the First



Russell Shorto had an article in the New York Times Magazine last week that has been a topic of conversation for my friends and me. It covered the efforts of certain conservative members of the Texas state board of education to propose changes to the social-studies curriculum guidelines. I encourage everyone to read it, as this is a story that I have been following for a while.

For those who are not aware, the Texas state board of education looks into revising one curriculum subject every year. Previously the focus had been on science, so the obvious battle was over the whole evolution/intelligent-design debate (I hate to use the term debate, but it is much nicer than my preferred word). The result was, at best, a marginal victory for science.

Now the subject to be revised is social studies, which looks to be just as controversial. I’ll let you read the article to understand why this is important to all of us (long story short – as goes Texas, so goes the textbook industry), but I did want to focus on a couple of issues that were raised.

The article states:

The one thing that underlies the entire program of the nation’s Christian conservative activists is, naturally, religion. But it isn’t merely the case that their Christian orientation shapes their opinions on gay marriage, abortion and government spending. More elementally, they hold that the United States was founded by devout Christians and according to biblical precepts. This belief provides what they consider not only a theological but also, ultimately, a judicial grounding to their positions on social questions. When they proclaim that the United States is a “Christian nation,” they are not referring to the percentage of the population that ticks a certain box in a survey or census but to the country’s roots and the intent of the founders.

I’ll get to whether or not this view has any validity in a second post, but I first wanted to address this past week’s primary discussion point.

Is this important?

For the sake of argument, let us say that the Founding Fathers were the devout, fundamentalist Christians that some contemporary conservatives seem to believe they were. Let’s say that they did intend this country to be established on Christian principles. What effect would that have on us now? How would teachers address this in the curriculum?

To someone like me, the religiosity of the nation’s founders is an interesting historical footnote, but not very important for how we need to govern now. We, as contemporary citizens, look to these men and their philosophies for inspiration and guidance, but we also recognize that much has changed since their era, perhaps invalidating some of their beliefs. The framers of the Constitution recognized the need for some form of institutional flexibility, which is why they allowed for amendments.

But many conservatives do not see things this way:

To give an illustration simultaneously of the power of ideology and Texas’ influence, Barber told me that when he led the social-studies division at Prentice Hall, one conservative member of the board told him that the 12th-grade book, “Magruder’s American Government,” would not be approved because it repeatedly referred to the U.S. Constitution as a “living” document. “That book is probably the most famous textbook in American history,” Barber says. “It’s been around since World War I, is updated every year and it had invented the term ‘living Constitution,’ which has been there since the 1950s. But the social conservatives didn’t like its sense of flexibility. They insisted at the last minute that the wording change to ‘enduring.’ ” Prentice Hall agreed to the change, and ever since the book — which Barber estimates controlled 60 or 65 percent of the market nationally — calls it the “enduring Constitution.”

A small incident, I suppose, but one that reveals the intentions of some of these board members. They want the views of the Founders to be dogma.

Of course, if you are going to teach something as dogma, you better make sure that it fits your belief system. As one of the guys on the Chariots of Iron podcast said, “I’m fine with theocracy ... as long as I get to be God. Otherwise ... ”

And now you see why it is so important to these conservative Christians that their students believe that "the United States was founded by devout Christians and according to biblical precepts."

But, as I will address in one of my next posts, I believe their premise to be faulty.

Wednesday, February 17, 2010

Feeling Sleepy Today ...

The Old Dutch Graveyard

Headless Horseman


Friday, February 12, 2010

The Measure of Man


Kanon Lad from Superhero Mundane

Can Opinion be Defended?

All of us have our personal quirks, those wonderful character traits that seem perfectly normal to us but set us apart from the consensus. One of mine is that, while I enjoy listening to news and sports commentary on the radio, I cannot stand to listen to the “call in” segments where listeners can ask a question or state an opinion. I have no problem when the moderator reads a question from an e-mail. I have no problem listening to the expert’s response to the question. But as soon as the interviewer/moderator says, “Let’s take a call”, I have to change the channel or switch the station for a couple of minutes, waiting for the caller to finish before I return to the program.

I’m not sure what that says about me. Nothing good, I’m sure.

Perhaps it is this same inclination that moves me to generally avoid the comments sections on websites. With a few exceptions, I find the arguments and dialogues in these sections to be fatuous or unfocused. The retreat into the protection of “my opinion” is particularly frustrating because it seems so common. Even my Appreciation students will fall back on this at the end of the semester, when they are confronted with more recent artworks.

What I see are arguments, even on truth statements, where the commenter berates his critics for questioning something that he feels is only an opinion. The underlying (or hidden) premise is that judgment, particularly aesthetic judgment, is completely arbitrary.

What I find particularly fascinating is that this view seemingly extends across the entire societal continuum. Everyone from anti-intellectual pundits, who decry contemporary art as the inane products of cultural elitists, to turtle-neck-wearing postmodernists appears to believe that aesthetics is all “in the eye of the beholder”.

The fake news site The Onion had a particularly fun parody of this phenomenon – a confusing alliance between those who hate art and those who hate “art”.

http://www.theonion.com/content/node/29798

While I recognize that the topic of aesthetic judgment is simply too vast to do justice to here, let me put forward a few thoughts on which I would like to see more discussion.

As further research is conducted in fields such as neurology, biology, and psychology (particularly Evolutionary Psychology, though I recognize its limitations), we are developing a greater understanding of how the mind works and how humans assess aesthetic concepts. Interestingly, the research seems to suggest a greater similarity amongst individuals and cultures than we had previously suspected.

Obviously, we need to recognize that, really, there is no such thing as Absolute Truth. This is a concept that is introduced in Philosophy 101 courses in every school in this country, and then repeated (ad nauseam) by every person who ever took that course. I still have colleagues who remind me of this in every conversation – e.g., “Brian, you cannot prove that this table before us actually exists”.

I take this concept as a given. Even in more evidence-based disciplines, science and history for example, nothing is seen as absolute, unalterable “fact” – everything is based upon a scale of probability. Evidence and reason help determine which theories hold the greatest probability of being true and factual. Of course, there does come a point where the evidence is so overwhelming in support of one theory that to hold a differing view would be irrational.

Could there be a somewhat similar spectrum to aesthetic judgments?

Let’s look at an example: Every year, as we get set for the Academy Awards, critics argue about which films were the most important or innovative – in short, which ones are most deserving of recognition. Generally, the group of films being discussed is fairly small, with perhaps only one or two front-runners. How is it, given the scores of movies that come out every year, that the majority of filmgoers will recognize a relatively small group as being “the best”?

Perhaps these judgments only become clear in the extreme. Critics can argue about whether Orson Welles gave a finer performance in Citizen Kane or Touch of Evil without approaching a conclusion. However, you would be hard pressed to find anyone who would consider Dude, Where’s My Car? to be on the same artistic level as either of Welles’ films. This is not to say that they might not enjoy Dude, Where’s My Car? more than they would Citizen Kane, but few people would see them as equal artistic achievements.

So this tells us something. Perhaps there are some objective underpinnings of our subjective responses.

Most people will not trust their evaluation of a meal if they happen to have a cold at the time. We recognize that certain factors restrict our ability to fully appreciate something – which could lead us to conclude that there are some identifiable (even testable?) determinants of quality.

Now, obviously, I cannot discount how variants in experience can affect evaluation. I believe that part of the reason that I so greatly enjoyed Pan’s Labyrinth was the totality of the experience that I had while watching it. My wife and I had made a special trip to a smaller, art house theater in Atlanta to see it. This was a theater where the picture and audio quality were superb and the audience was experienced in viewing foreign films.

Yet, I feel that, even if I had viewed the film on my iPod, I would have at least recognized the film as well-made and superior to most of the movies that I had seen that year. I imagine that most relatively intelligent people would have drawn the same conclusion, even if they did not enjoy it as much as I did. What I mean is that, regardless of experience, intelligent arguments could be made to support the concept of this film as being one of quality.

Now there are current philosophies that reject the concept of a hierarchy of quality and stress that any conclusion drawn can only be personal and arbitrary.

For now, let’s begin our discussion by examining the premises that I have already introduced. Is “merely stating an opinion” a sufficient argument? Is it possible to establish some form of a “hierarchy of taste” where certain artworks are clearly superior to others?