Thinking


When most people hear “argue” or “argument,” they think of two people locked in a heated quarrel.  I’m not referring to that.  I’m referring to logical discourse.

Argumentation has fallen on hard times in our postmodern age.  Arguments have been replaced by assertions, rhetoric, and sound-bites.  The reasons for this are many: the idea that there are no absolute truths to argue for, a false notion of tolerance, and a pragmatic approach to life, to name a few.  We have become more concerned with the utility of an idea than with its truthfulness, and with our subjective feelings than with objective truth.  What I find both interesting and disheartening is that even conservative Christians have disengaged from the art of argumentation.

For many there is an aversion to the very word “argument” because in their mind it connotes fighting.  But there is a difference between being argumentative (a psychological and behavioral disposition), and presenting an argument.  An argument is simply a series of reasons given in support of, or in opposition to some proposition(s).  In this sense the process of argumentation is vital to the epistemological veracity of Christianity.

The process of argumentation and debate aids us in our journey toward more truth.  Argumentation forces us to think of things we might not have thought about before, and only by doing so do we have a chance to grow in knowledge and wisdom.  In his book The Revolt of the Elites Christopher Lasch wrote that it is only in the course of argument that “we come to understand what we know and what we still need to learn,…we come to know our own minds only by explaining ourselves to others.”  The process of argumentation puts our own ideas at risk.  In the words of John Leo, arguments “can rescue us from our own half-formed opinions.”  The opinions that survive the argumentation process demonstrate to both us and our opponents the strength or lack thereof of our ideas.

Arguing with those who hold positions contrary to our own is an act of love because its aim is to rescue people from bad ideas, and bad ideas have bad consequences.  So contrary to those who oppose argumentation because it is unloving, nothing could be more loving.  We actually fail to act in love if we allow someone to hold false beliefs.

I’m sure all of you have heard someone claim that experience, not reason, is the best guide for truth.  The best way to expose the fallacy of this sort of thinking is by responding, “That hasn’t been my experience.”  For those who claim that feeling, not reason, is the best guide for truth, respond by saying, “I don’t get the feeling that that’s true.”

The fact of the matter is that feeling and experience are not sufficient grounds for belief.  Everybody claims to have an experience.  Buddhists have an experience; Jehovah’s Witnesses have an experience; Mormons have an experience.  Whose experience is valid?  Who is interpreting their experience correctly?  What do you do when you have competing experiences?  What can serve as an arbiter?  It can’t be another experience, lest we find ourselves arguing in a circle.  We must appeal to something else that is public and objective rather than private and subjective.  That something is reason.  What reasons do we have to believe that the person’s belief-system is correct?  If no appeal to public evidence is provided, no meaningful discussion can transpire.

When we argue for truth based only on our experience, we cut off our own ability to persuade others of our view.  While our experience may be enough to convince us that what we believe is true, the non-believer cannot get at and evaluate our experience, and thus has no way of knowing whether our claims are true or not.  Arguing from experience, then, is a liability in that it prohibits you from being able to persuade anyone else that you are right.  This does not mean experience plays no role, for it does.  It can serve to support and confirm the rational case for our position; it simply cannot constitute the very grounds of our position.

The other day I was thinking about backsliders.  I asked myself, Why is it that people leave God as often as they do?  Most times it is simply because they want to live an immoral lifestyle.  They know the truth, but don’t want to live that truth.  Sometimes, however, it is because they never had a strong intellectual commitment to Christianity as truth.  They were Christians because of their experience, or because Jesus was meeting their needs.  When people are Christians for these reasons, typically they do not remain Christians for long.  When their experience fades, and when Jesus doesn’t meet some felt need, they begin to doubt the Christian religion, or their particular denomination’s version of the Christian religion.

This underscores why Jesus said we are to make disciples of all nations.  The life of the mind is of the utmost importance in the Christian religion.  If people are not persuaded that Christianity is true, their faith will never be firmly established in Christ.  But when our minds are renewed according to the truth of God, it will be impossible to reject the truth of Christianity.  We may choose to ignore the truth so we can follow after our own sinful desires, but we will not be able to walk away from the truth as the truth.

It’s common for those who fail to see the value of learning apologetics to claim that all we need to do is simply present the truth to those in error, and then trust the Lord to convict their hearts.  On this view, there is no need to present people with reasons to believe that what we claim is true is indeed true; we just proclaim the truth and trust God to do the rest.  The unbeliever either sees it, or they don’t.  If they don’t see it, it must be because they don’t love God and have chosen to reject the truth.

What I find interesting about those of this opinion is that they often argue with those of us who see the value of presenting the unbeliever with reasons to believe, claiming that our approach is not based on faith.  “We need to trust in God to change their minds,” they claim.

Did you catch the contradiction?  While they claim to trust in God to convince unbelievers of the truth apart from a rational defense of the truth, they do not trust God to convince us of what they believe to be true: that our rational-support approach to truth is in error.

So the next time an anti-rationalist accuses you of not having enough faith in God to convince unbelievers of the truth, ask him/her “If you have faith in God, why are you confronting me on this issue?  Why are you giving me reasons to ignore reason and ‘just have faith?’  Why didn’t you just pray and trust God to change my mind on this issue?”

They didn’t pray for God to change your mind because they understand that persuasion comes through information, not just spiritual conviction.

Rob Bell, pastor of the influential Mars Hill Church in Michigan, wrote a book entitled Velvet Elvis: Repainting the Christian Faith.  In one section of the book Bell writes:

It is important to remember that we rarely find these first christians trying to prove that the resurrection actually occurred.  To try to prove there was an empty tomb wouldn’t have gotten very far with the average citizen of the roman empire; they had heard it all before. This is why so many passages about the early church deal with possessions and meals and generosity. They understood that people are rarely persuaded by arguments, but more often by experiences. Living, breathing, flesh-and-blood experiences of the resurrection community. They saw it as their responsibility to put Jesus’ message on display. To the outside world, it was less about proving and more about inviting people to experience this community of Jesus followers for themselves.

Mark Oestreicher (who quoted the above passage on his personal blog), president of Youth Specialties, added his own two cents in support of Bell’s comments:

People today could care less about the “proof” of our arguments, the “logic” of our evidence that demands a verdict, or our “cases” for faith, christ, easter, christmas or whatever else. The only evidence demanding a verdict people care about these days is how i live my life. The only case for christ people give a rip about is the case made by commitment to love and justice, or lack thereof.

It is unfortunate that such high-profile, youth-focused Christian leaders would make such remarks.  While it is true that people are interested in seeing our Christianity lived out in real life—and not just hearing our arguments for Christianity—that does not mean they are unconcerned about our arguments.  Every human being is concerned with the truth because we are made in the image of the one who is Truth.  Knowledge of the truth requires epistemic justification of some sort, and to some degree, and hence arguments are beneficial.


In 1941, philosopher Mortimer Adler wrote a short but impactful article for the Journal of Educational Sociology titled “Invitation to the Pain of Learning.”  Adler argued that thinking/education is one of the highest and most rewarding pursuits of man; unfortunately, it is also one of the most difficult and painful.  As a result, genuine education is being abandoned for what some have called “infotainment.”  Education has become a passive enterprise, in which teachers provide students with information dumbed down so that it is entertaining, fun, and pragmatic.  But education should be an active enterprise in which we engage ideas and subjects that challenge our mind and shape our character.  Adler calls on both people and educational institutions to embrace the short-term pain of learning for the ultimate satisfaction of a transformed life.  Here are some great excerpts that are worth your time to read:

One of the reasons why the education given by our schools is so frothy and vapid is that the American people generally – the parent even more than the teacher – wish childhood to be unspoiled by pain. Childhood must be a period of delight, of gay indulgence in impulses. It must be given every avenue for unimpeded expression, …and it must not be made to suffer the impositions of discipline or the exactions of duty, which of course are painful. Childhood must be filled with as much play and as little work as possible. … Heaven forbid that learning should ever take on the character of a serious occupation – just as serious as earning money, and perhaps, much more laborious and painful. … It must all be fun. It must all be entertaining. Adult learning must be made as effortless as possible – painless, devoid of oppressive burdens and of irksome tasks.

[T]he fundamental activity that is involved in every kind of genuine learning is intellectual activity, the activity generally known as thinking. Any learning which takes place without thinking is necessarily of the sort I have called external and additive – learning passively acquired, for which the common name is “information.” Without thinking, the kind of learning which transforms a mind, gives it new insights, enlightens it, deepens understanding, elevates the spirit simply cannot occur.  Anyone who has done any thinking, even a little bit, knows that it is painful. It is hard work – in fact the very hardest that human beings are ever called upon to do. It is fatiguing, not refreshing. … Far from trying to make the whole process painless from beginning to end, we must promise them the pleasure of achievement as a reward to be reached only through travail.

I do not know…whether it [radio and television] can ever do what the best teachers have always done and must now be doing; namely, to present programs which are genuinely educative, as opposed to merely stimulating, in the sense that following them requires the listener to be active not passive, to think rather than remember, and to suffer all the pains of lifting himself up by his own bootstraps.

Unless we acknowledge that every invitation to learning can promise pleasure only as the result of pain, can offer achievement only at the expense of work, all of our invitations to learning, in school and out, whether by books, lectures, or radio and television programs will be as much buncombe as the worst patent medicine advertising, or the campaign pledge to put two chickens in every pot. 

I particularly like what he says about teaching over people’s heads.  While this practice is usually condemned, Adler argues it is absolutely essential to good education:

[W]e must have no fears about what is “over the public’s head.” Whoever passes by what is over his head condemns his head to its present low altitude; for nothing can elevate a mind except what is over its head; and that elevation is not accomplished by capillary attraction, but only by the hard work of climbing up the ropes, with sore hands and aching muscles. The school system which caters to the median child, or worse, to the lower half of the class; the lecturer before adults…who talks down to his audience; the radio or television program which tries to hit the lowest common denominator of popular receptivity – all these defeat the prime purpose of education by taking people as they are and leaving them just there.

I couldn’t agree more.  People need to be intellectually challenged if they are ever to grow intellectually.  That’s not to say we should speak in words they do not understand (at least without defining those words for them), or that we do not appeal to their existing knowledge base, but it is to say that we shouldn’t always be covering the ABCs.  It’s appropriate to move on to higher letters in the alphabet.  Christians need to be weaned from theological milk, and learn to eat some theological steak.  Otherwise, they’ll be condemned to being Peter Pan Christians for the rest of their lives.

Last month Amanda Gefter opined in New Scientist that when it comes to explaining the fine-tuning of the universe, “It Isn’t as Simple as God vs the Multiverse.”  She was referring to recent comments made by Steven Weinberg and Tim Folger to the effect that cosmic fine-tuning can only be explained by a supernatural cosmic designer or a multiverse.  While Gefter thinks the multiverse hypothesis is a good one, she takes exception to this dichotomy as being unscientific:

There are plenty of reasons to take the multiverse seriously. Three key theories – quantum mechanics, cosmic inflation and string theory – all converge on the idea. But the reason physicists talk about the multiverse as an alternative to God is because it helps explain why the universe is so bio-friendly. From the strength of gravity to the mass of a proton, it’s as if the universe were designed just for us. If, however, there are an infinite number of universes – with physical constants that vary from one to the next – our cosy neighbourhood isn’t only possible, it’s inevitable.  But to suggest that if this theory doesn’t pan out our only other option is a supernatural one is to abandon science itself.

How so?  According to Gefter it is because “science never boils down to a choice between two alternative explanations.  It is always plausible that both are wrong and a third or fourth or fifth will turn out to be correct.”  While I would object to an absolutist interpretation of “never,” in general I would agree that in principle, at least, there could be explanations of the cosmic fine-tuning other than a supernatural creator or the multiverse.

But what might they be?  After all, the reason folks like Weinberg and Folger have reduced the debate to a dichotomy between a supernatural creator and the multiverse is that, to date, no other explanations fit the data.  Gefter postulates that maybe we “endow the universe with certain features by the mere act of observation. … [O]bservers are creating the universe and its entire history right now.  If we in some sense create the universe, it is not surprising that the universe is well suited to us.”  What is surprising, however, is the fact that Gefter entertains this wild and incoherent speculation as a rational, scientific possibility (in her own words, “That’s speculative, but at least it’s science.”).

To say we create the universe through our observation is to say we cause the universe (including its past and present forms) to exist, and to exist in a certain way.  But this is absurd for several reasons:

  1. It would require backward causation, in which present causes (our observations) produce historical effects.  What philosophical or scientific reason is there to believe this is plausible, let alone possible?
  2. If our act of observation is the sufficient cause of the universe’s existence, then prior to our observation (the cause) there was no universe (effect).  If there was no universe, what were we observing?  Nothing.  If there was nothing to observe, there was no effect to affect.
  3. Where did observers come from?  If, for observers to exist, the universe must be finely-tuned to produce them, then the universe must precede its observers causally, logically, and temporally.  If a finely-tuned universe must precede its observers, then it is the cause of us; we are not the cause of it.
  4. If observers cause the universe to exist, and the universe in turn causes observers to exist, then we must exist prior to existing, which is incoherent.
  5. If observers endow the universe with certain features by the act of observation, and observers observe different (and sometimes conflicting) things, why isn’t the universe endowed with different laws, and a different history simultaneously?  Why doesn’t the universe have an eternal past when observed by a proponent of the Steady-State model, and a finite past when observed by a proponent of the Big Bang model?  If the universe is a real existent, it cannot be both eternal and past-finite simultaneously.  One of the observers must be mistaken.  If that observer cannot alter reality by his observation, then it follows that our observing the universe has no causal relationship to the universe.

Far from demonstrating the inadequacy of the creator-multiverse dichotomy, Gefter confirms it.  If the dichotomy can only be avoided by postulating something so absurd as the notion that we create the universe by observing it, surely it is more rational to stick with the dichotomy.

HT: Colliding Universes

Which sounds more appropriate?

1.  My opinion is that vanilla ice-cream is the best flavor of ice-cream.
2.  My conviction is that vanilla ice-cream is the best flavor of ice-cream.

I think most people would go with option 1, but why?  The denotative meaning of “opinion” and “conviction” allows for both usages, but the connotative meaning is quite different.  “Opinion” connotes a weak epistemic viewpoint.  When someone says they have an opinion on a matter, we tend to think there was little, if any, research that went into forming their viewpoint.  “Opinion” has subjectivity and personal taste written all over it.  “Conviction,” on the other hand, connotes a much stronger epistemic viewpoint.  When we hear someone say their conviction is that X is true, we tend to think there was at least a fair amount of research that was instrumental in forming their conclusion.  A conviction is not entirely subjective, but is based on facts.

I would not make a doctrine out of this, but it seems to me that when we are speaking of our perspective on matters of objective truth, we should couch it in terms of our “conviction” rather than in terms of our “opinion.”  Opinion seems better reserved for matters of subjective truth like one’s favorite flavor of ice-cream.  Conviction bespeaks rational persuasion.  This is important in a culture in which religious claims are presumed to be flavors of ice-cream, with everyone simply picking the flavor that appeals to their tastes.  We need to make it clear that we do not have mere opinions on religious matters, but have developed genuine convictions through researching matters of objective fact.

Here is my thought for the day: You can generally judge the depth of a thinker and the value of his/her thoughts by how familiar s/he is with the thoughts of others. 

There is a difference between a person who has formed ideas, and a person who has formed ideas in the context of other thinkers’ ideas.  Generally speaking, those who are ignorant of the insights and developments of others, past and present, have a very limited, and often skewed, perspective.  They are likely to miss the big picture, repeat the mistakes of others in the past, or fail to account for something simply because they have yet to consider it.

Those who form their thoughts in a vacuum from other thinkers tend toward error.  I often hear preachers preface some remark by saying, “I didn’t get this from no man.  The Lord revealed it to me straight from the Good Book.”  Whenever I hear that, I know that what I’m about to hear is probably off-base.  And it usually is.  As Charles Spurgeon said, “It seems odd, that certain men who talk so much of what the Holy Spirit reveals to themselves, should think so little of what he has revealed to others.”  Those who try to reinvent the wheel without knowledge of past wheel makers never do a very good job at it.

Those who have conversed with the thinkers of today and yesteryear, however, will tend to have a much better, more informed perspective.  They tend to be more balanced, and aware of their intellectual limitations.  When I read something someone wrote on the topic of theology or philosophy, I’m looking at the footnotes to see what sources, if any, the author has used.  It’s usually a good indication of the quality of work I’m about to read.

When it comes to theology and philosophy, we would be stupid not to pay attention to what others have said before us.  It is the epitome of hubris to think others (particularly those in the past) have little or nothing to offer us.  99.99% of what we know is inherited from the intellectual labor of those who came before us.  If we ignore them, we are only left with 0.01% of true knowledge.  Woe to us if we attempt to think in an intellectual vacuum.

Some concepts are so heady that they are difficult to put into words.  For example, how does one talk about what God was doing before creation, when creation marks the beginning of temporality?  There cannot be a “before” the beginning, and yet we can conceive of God’s existence before time began.  While it is difficult to put this into words, one way of doing so is to speak of God existing “without the universe.”  Problem solved. 

There are other concepts, however, which are impossible to put into words.  Consider “nothing.”  It is impossible for us to even imagine nothingness, let alone to reduce it to words.  For example, according to the Big Bang theory of cosmic origins, our universe came into existence from literally nothing about 13.7 billion years ago.  But to say our universe came into existence “from” nothing treats nothing as if it were somewhere from which the universe emerged.  It isn’t, and that’s not what scientists mean to say, but that is the picture that emerges when we try reducing this concept to words.

We might even conceive of a “time when nothing existed,” but this too is unintelligible.  There cannot be a “time” when nothing existed, because a state of nothingness includes the absence of temporality.  I’ve done it again.  I referred to nothingness as a “state,” but it is not a state.  It is nothing!  The fact of the matter is that no matter how we choose to refer to the concept, the moment we do so, we reify it in the process.  We do the same thing for other concepts.  Consider darkness.  Darkness, as such, does not exist.  It is the absence of all light.  And yet the moment we refer to “darkness,” we reify it, as if it were an existent.  This is a shortcoming of language we have to live with, but we need to be cognizant of the fact that speaking of nothingness, or of something coming from nothing, does not mean nothing is something.  It means no-thing.

René Descartes was the first modern philosopher. He was a rationalist. His goal was to ground knowledge in something that could not be doubted. He found such a grounding in his famous formulation, Cogito, ergo sum (I think, therefore I am). The question to be answered was how he could know he existed. The answer was to be found in his act of contemplating the very question. To contemplate existence requires a contemplator who exists. That he was thinking about doubt was something he could not rationally doubt, and thus he concluded that he knows indubitably that he exists. He reasoned deductively as follows:

P1 The act of thinking requires the existence of a thinker
P2 I experience the act of thinking
______________________________________________
I exist as a thinker

Some argue that Descartes’ key insight actually turns out to depend on a logical fallacy: begging the question. The question is whether there exists a personal subject, “I.” And yet “I” is smuggled into the second premise of the argument. That is question-begging, for it assumes there is an I to experience the act of thinking, and then concludes that there is an I who thinks. I am conflicted about this. On the one hand, this seems reasonable to me. Descartes’ reasoning does seem to beg the question. On the other hand, Descartes’ argument seems valid: the ability to contemplate one’s existence requires that one exist. What do you think?

If Descartes did beg the question, invalidating his argument, then it seems there is no non-question-begging argument that could indubitably prove I exist. Of course, this does not mean I do not exist. I do, and I know I do. It simply means we can’t demonstrate how we know this, other than an appeal to basic intuition.

I think this is a helpful lesson for skeptics. One does not need to be able to prove (or know how) X is true in order to know X is true. Some truths are properly basic; i.e. they are self-evident, do not need to be questioned, and do not need evidential demonstration.

UPDATE 3/1/17: Perhaps the supposed question-begging nature of the argument is merely the fault of how analytic philosophers structure the argument. For example, if we state the argument as follows, it does not beg the question:

P1 The act of thinking requires the existence of a thinker
P2 There are acts of thinking
______________________________________________
Thinkers must exist
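
For those who enjoy seeing such things made explicit, here is a minimal formal sketch of the revised argument in the Lean proof assistant (the names `Entity`, `IsActOfThinking`, and `IsThinker` are merely illustrative labels of my own choosing, not anything from Descartes):

```lean
-- A minimal sketch of the revised syllogism (the labels are illustrative).
theorem thinkers_exist {Entity : Type}
    (IsActOfThinking IsThinker : Entity → Prop)
    -- P1: every act of thinking requires the existence of a thinker
    (P1 : ∀ a, IsActOfThinking a → ∃ t, IsThinker t)
    -- P2: there are acts of thinking
    (P2 : ∃ a, IsActOfThinking a) :
    -- C: thinkers must exist
    ∃ t, IsThinker t :=
  Exists.elim P2 (fun a ha => P1 a ha)
```

Notice that neither premise smuggles in an “I”; the conclusion only asserts that some thinker or other exists, which is why this phrasing seems to escape the charge of question-begging.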

It is often said that science is incapable of arriving at certainty because science is based on inductive reasoning, and the conclusions we come to using inductive reasoning are probabilistic, not certain. For example, I could reason that since every crow I have ever seen has been black, all crows must be black. This is probable given our observations, but the conclusion is not certain because it is possible that there are crows of a different color that we have not yet observed.

The history of science has demonstrated just how fallible inductive reasoning is. Many scientific conclusions have proven to be wrong as new data comes to light. For this reason, science is incapable of speaking with any level of certainty to the question of God’s existence. Inductive reasoning simply cannot tell us anything conclusive about God’s existence.

Philosophy, on the other hand, works primarily on deductive reasoning, and deductive reasoning does provide us with certain conclusions. For example, I might reason that

Premise 1 Socrates is a man

Premise 2 All men are mortal

Conclusion Therefore Socrates is mortal.

The conclusion is absolutely certain. Sound philosophical arguments for God’s existence, then, can provide us with certainty about God’s existence. But do they?

While the conclusion of a valid deductive argument is certain, we can only be as certain of the conclusion as we are of the premises that support it. It turns out that the premises in a deductive argument are themselves derived from inductive reasoning or experience, both of which can be mistaken. In other words, deductive arguments provide sure conclusions from merely probable premises. In the example above, premise two is an inductive conclusion based on our experience with other humans. We have observed that human beings die, and thus conclude that all human beings are mortal. But it could be the case that there are humans who are immortal that we do not know about. Maybe they live on other planets or in another realm of reality. Granted, the chances of this are slim, but we cannot be certain. The degree to which we can be certain that Socrates is mortal, then, is the degree to which we have reason to believe all men are mortal.
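
One way to make this point a bit more precise (a rough sketch, assuming we can meaningfully attach probabilities to premises at all): in a valid argument the premises jointly entail the conclusion, so our confidence in the conclusion is bounded below by our confidence in the premises taken together.

```latex
% For a valid argument with premises P_1, P_2 and conclusion C:
\Pr(C) \;\ge\; \Pr(P_1 \wedge P_2) \;\ge\; 1 - \Pr(\neg P_1) - \Pr(\neg P_2)
```

So if we are, say, 99% confident in each premise, the conclusion is guaranteed at least 98%: strong support, but still short of certainty.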

So contrary to popular conception, deductive arguments do not provide certain knowledge. They may provide us with more assurance than inductive arguments, but no argument can provide us with certain knowledge. All knowledge is probabilistic in one sense or the other. While philosophical proofs for God’s existence are vastly superior to inductive proofs, neither can provide us with certainty on this important question.

I take the train to work. Most days, there is a man playing the saxophone near the staircase at the depot. He always displays some sort of message to the passersby, usually of a religious or philosophical bent, and not always profound. Today’s message struck me as particularly dumb: “For every question, personal experience is the final test of truth.”

Really? I wanted to ask him if he had ever been to Naples. I would expect a negative answer, at which time I would respond, “So I assume you don’t believe Naples exists, then, right?” If personal experience is the final test for truth, and we have not experienced it, then it cannot be true. In my own life, a whole host of things cannot be true: murder, wealth, and the like. Very dumb. And yet, empiricism is how many people go about determining truth.

Philosopher Alexander Pruss has been sharing some “Deep Thoughts” over at his blog. Here were my favorites:

The past is over.
The future has not yet come.
What was will never not have been.
Impossible things don’t happen.

Don’t you love tautologies?

“If our ideas are easily destroyed by those acquainted with the facts, they ought to be discarded. But if our ideas are good, they will not be upended so easily. … Developing answers to the toughest arguments against our position strengthens both our witness and our confidence in our convictions.” – Greg Koukl, May 2008 introductory letter to the May/June 2008 issue of Solid Ground

“Ignorance may be bliss, but it is not a virtue.” –D.A. Carson, Exegetical Fallacies, p. 23

The law of non-contradiction (LNC) states that something cannot be both A and not-A at the same time and in the same way. For example, my car cannot be said to be both in the garage and not in the garage at the same time and in the same way. It could only be both in the garage and not in the garage at the same time if “in the garage” in the first instance means something different than it does in the second. For example, it would not be a contradiction if in the first instance I mean to refer to the body shop where my car is being repaired, and in the second instance I mean to refer to its normal storage space, where it is currently absent.
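
For what it is worth, the LNC is so basic that in a formal system it is a one-line theorem. Here is a sketch in the Lean proof assistant (the theorem name is just a label I picked):

```lean
-- The law of non-contradiction: no proposition P is both true and false.
theorem no_contradiction (P : Prop) : ¬(P ∧ ¬P) :=
  fun h => h.2 h.1  -- given "P and not-P", apply not-P to P to reach a contradiction
```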

Postmodern types disparage the LNC (as they do all laws of logic) as a Western invention. No argument is made for such a claim. It is just asserted (indeed, any argument offered against the LNC would have to presuppose its truth, since it assumes the argument’s premises and conclusion are not identical to their negations). I have a sneaking suspicion I know why they want to axe the LNC: their worldview is inherently self-contradictory.

Postmodernism claims there is no truth, or that truth cannot be known. And yet, this is a contradiction because the claim that there is no truth, or that truth cannot be known is itself a claim to know something that is true. If the LNC is true, then postmodernism is false. The LNC must be axed to save postmodernism as a worldview.

When you point out the self-referential and incoherent nature of postmodernism, the postmodernist will retort that such an analysis depends on the LNC. Since the LNC is a Western invention, it is inappropriate to subject postmodernism to its criterion. In fact, doing so is just a power play to subjugate others.

What can you say to those who deny the LNC? Greg Koukl has offered a good strategy. When someone claims the LNC is not true, but an invention of Western logic, respond, “So what you are saying, then, is that the LNC is true?” They will protest, “No, I am saying it is not true.” We might respond, “Oh, so you are saying the LNC is true, then. Thank you for clarifying.” Frustrated, they will reply, “No, no. That is not what I am claiming. I am claiming the LNC is not true.” We might graciously answer, “Exactly. That is what I said you said: The LNC is true.”

I would venture to say they would be exasperated with you by this point, aggravated that you would contradict them. But this exposes the very problem with their claim that the LNC is a Western convention rather than a universal and necessary feature of human rationality. While they deny the LNC with their lips, they cannot help but recognize that “is” and “is not” are contradictory, and thus that your restatement of their view contradicts their stated view. That is inescapably self-refuting. They cannot deny the existence of contradictions on the one hand, and then correct your contradiction on the other.

For a person who truly believes the LNC is a fiction of Western logic, the only appropriate response to your restatement is a confirmatory, “Yes.” But no one would respond in this way. He would initially seek to correct your contradiction, assuming you have misunderstood him. Even if one dared to respond in this way, I would venture to say he does not believe that which he speaks. For if he believed it, he would have to acknowledge that there was a difference between his believing it, and not believing it. And if such a difference exists, the LNC must be true. The LNC is a first principle of thought that cannot be avoided (rational intuition). It is universal and necessary to all human reasoning—even for those who seek to deny it.

Of course, there are other, more persuasive ways of illustrating this truth that guarantee your postmodern friend will come to acknowledge the truth of the LNC. The early 11th-century Muslim philosopher Avicenna devised an infamous way of helping someone see the irrationality involved in denying the LNC. Avicenna wrote, “Anyone who denies the LNC should be beaten and burned until he admits that to be beaten is not the same as not to be beaten, and to be burned is not the same as not to be burned” (Metaphysics 1). By no means would I suggest using this tactic, but if this wouldn’t convince your postmodern friend of his error, nothing can!

Just for fun…. There are age-old questions that circulate from generation to generation, supposedly unanswerable. Surprisingly enough, most of these questions are far from being intellectual enigmas. They can be answered, and answered quite easily at that. Here are a few:


If a tree falls in the forest, and there is no one present to hear it, does it make a sound? Of course! Sound waves are produced whether there is anyone present to receive them or not.


Did Adam and Eve have belly buttons? No. Belly buttons are scars from the umbilical cord attached to us during our prenatal stage of life. Adam and Eve were created as adults. Since they never experienced the prenatal stage of life, they never had an umbilical cord attached to their bodies, and thus they never would have developed a belly button.


Where did Cain get his wife? It was his sister! Yes, I know, that’s gross. But it’s true nonetheless.


Which came first—the chicken or the egg? The chicken! For the egg to produce a chicken it would have to be a chicken egg, meaning it would have to contain the genetic blueprint for building a chicken. But where would such a blueprint come from if not a chicken? Without a prior chicken, there could be no egg capable of producing a chicken. The chicken would have to exist as a species before it could reproduce itself.


Why did the chicken cross the road? To lay the first egg. See above.


If you are inside a falling elevator and you jump in the air just before the elevator hits the ground, will you escape injury? No. While inside the elevator your body is traveling at the same speed as the elevator. If the elevator is falling toward the ground at a rate of 15 mph, your body is also falling toward the ground at a rate of 15 mph (even though you are at rest relative to the elevator). Jumping in the air will only slightly reduce your speed at impact, and thus only slightly reduce your injuries! If you are not convinced, think what would happen if you were sitting on the hood of a car going 15 mph, and the car suddenly slammed on its brakes. Your body would continue to travel at 15 mph, catapulting you from the hood of the car to the gravel on the road. The same principle applies with the elevator.
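
To put some rough numbers on it (a back-of-the-envelope sketch; the 15 mph and 4 mph figures are merely illustrative assumptions, not measurements), here is how little a perfectly timed jump would help:

```python
# Back-of-the-envelope illustration: how much does a perfectly timed jump help?
# Both speeds below are illustrative assumptions, not measured values.

elevator_speed_mph = 15.0  # you and the elevator are both falling at this speed
jump_speed_mph = 4.0       # roughly what a strong vertical jump can subtract

impact_without_jump = elevator_speed_mph
impact_with_jump = elevator_speed_mph - jump_speed_mph  # the jump only shaves off a little

print(f"Impact speed without jumping: {impact_without_jump:.0f} mph")
print(f"Impact speed with a perfectly timed jump: {impact_with_jump:.0f} mph")
```

Even with perfect timing you would still hit the floor at roughly 11 mph, which is why jumping cannot save you; at best it shaves a little off the impact.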


So there you have it. Can you think of any other examples?

It’s common to hear people say “I do not expect to change your mind” in the course of debate these days. Just recently I was debating someone on an exegetical issue involving 1 Thessalonians 4:14 who said these very words to me after only one round of correspondence.

While there are instances in which this assessment is justified–such as when your opponent declares, “Nothing you say is going to change my mind,” or when, after a sufficient amount of dialogue, it becomes clear that your opponent suffers from intellectual stubbornness–it is often used prematurely and inappropriately. I would advise dispensing with such talk for two reasons.

First, I think it communicates a defeatist message, and that prematurely. It may be that neither individual will change his position as a result of the debate (although each often cedes various points), but one should let the debate run its course before concluding that one’s arguments have failed to persuade one’s opponent.

Secondly, and more importantly, the comment is demeaning either to oneself or to one’s opponent. It can be self-demeaning in that it concedes the lack of cogency of one’s argument a priori. How can we be so sure our arguments will not persuade our opponent? If we do not think they are persuasive, why even offer them?

More often, however, such a comment is meant to demean your opponent. It communicates the idea that you don’t think he possesses enough intellectual honesty to change his position in light of the evidence you are presenting. That is very demeaning.

Whether it demeans the quality of our arguments or the intellectual honesty of our opponents, such a statement should be used wisely and infrequently.

And for the record, I do expect my arguments for a limited use of this comment to change your mind! And so should I. If our arguments are good ones, none of us should expect any less.

What relationship does rationality have to faith? While some only convert after they have examined the evidence for Christianity, most people convert based on a personal experience with Jesus Christ. That’s the way it was for me. I came to believe Christianity was true, not by a rational examination of the evidence, but because of my personal encounter with the risen Christ. I remain a Christian, however, not only because of my past and present experience, but because I have examined the rational evidence for Christianity and found it superior to all other worldviews.


Whether one first believes because of what he knows by experience, or because of what he knows by rational examination, the fact remains that a robust faith requires both. He who first believes based on an experience needs to supplement that experience with a rational inquiry into the faith he now holds. He who first believes based on a rational examination of Christianity needs to supplement his persuasion with a personal encounter with Jesus Christ.


For further reading about the relationship of faith and rationality, see my articles on the topic at IBS:


Faith Has Its Reasons
What is the Relationship of Reason to Revelation?
A Balanced Perspective on Reason and Faith
Investigating Faith: Placing Religious Truth Back Into the Arena of Knowledge
Religious Truth Can Be Known
Scaling the Gulf Between Scientific and Religious Knowledge
