Battle for the soul – the God question and epistemology, or “why do you believe what you believe”?

Let me begin this article with a disclaimer: this post is somewhat tangential to the main theme of this blog (although I do think there are lots of interesting connections worth talking about between religious belief and Bayesianism). Disclaimer 2: I stayed up later than I planned writing this, and I didn’t really finish moulding it into a smoothly flowing essay. I am now too tired to continue and don’t care anymore, but I hope it is at least a little interesting for those of you who might read it. Please forgive its sloppiness and half-finished character :p

Instead, I am more interested in the particular kind of inner conflict that is waged in the soul of anyone who has struggled with the question of whether or not God exists, or with more general questions about the nature of reality, or even with more mundane things like whether Santa Claus exists. The conflict I refer to is the battle to achieve a self-consistent epistemology, that is, a self-consistent theory of knowledge. You may never have called it by such names, but if you have ever encountered a proposition which fundamentally challenged your belief system, if you have undergone a mental battle to either reject that proposition or else modify your belief system, then it is quite probable that your brain has also struggled to decide whether it is indeed judging the validity of propositions by the right criteria; such criteria effectively form your own internal theory of knowledge.

There is a large subconscious element to this struggle, I believe. For example, I like to think I know quite a lot about fundamental physics, and thus about what sorts of things are and are not possible in this universe, yet I still on occasion feel a chill down my spine working late at night in my big empty open-plan office, as if my subconscious mind has not yet entirely dismissed the possibility of paranormal phenomena, no matter how vehemently my conscious mind argues that such notions are ridiculous.

But I digress. I have an intuition that, for most of us, our internal theory of knowledge solidifies when we are quite young; this is why, for instance, some of us are happy to accept the possibility of transcendental sources of knowledge (such as being spoken to by a God or gods, or perhaps ‘feeling’ his/their presence in our lives), and why others of us deny the validity of this. I perhaps superficially fall into the ‘no transcendental knowledge’ camp; however, I consider the question of why I believe this to be of the utmost seriousness, the entire foundation of one’s views on reality being too important to leave to the random events of one’s particular childhood.

If one is serious about this examination, it seems to me that one should rapidly approach the following question: why believe anything? What in fact justifies a person to any degree of belief in one proposition or another? Even our (my?) most sacred scientific values are not immune to such epistemological nihilism. If one cannot convince oneself of at minimum a partial answer to this question, then all knowledge must be denied; even the words I type right now should be denied meaning.

This blog being somewhat about Bayesianism and statistics, you might think I am about to argue that these should form the foundation on which all human knowledge is built. And in some sense I do think this, but the question of the preceding paragraph undercuts the foundations even of these. So one must first have some reason to accept them as a foundation for knowledge.
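To make the regress concrete, here is the standard statement of Bayes’ theorem, with H a hypothesis and D the observed evidence:

```latex
P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}
```

The posterior P(H|D) is only as well-founded as the prior P(H) and the likelihood model P(D|H), and the update rule itself cannot tell you where those come from: that is the “choose your axioms” problem in probabilistic dress.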

In fact there is no truly satisfactory solution to this kind of extreme deconstruction of knowledge, as far as I am aware. It is a catch-22: no argument can establish its own logical foundations in such a completely self-contained way. If I were a logician, I expect something deep could be said about Gödel’s theorems at this point.

So, we must work from axioms and build a theory of logic and, subsequently, of knowledge. But what should our axioms be? Could we choose other ones? By what process should we accept some set of axioms? We appear to be again screwed. We seem to have nothing but our intuition with which to produce said axioms, and no reason to believe that our intuition will produce for us the “correct” axioms, if such things exist. If we do work from some axioms (or the intuitions to which they correspond), it is easy to imagine we might convince ourselves of the validity of those axioms, but such reasoning would be circular, and my own axioms (at least) lead to the rejection of circular logic.

Some philosophers seem to argue that while perceptions are fallible, we have some primary core intuitions which are trustworthy, and that these give us enough foundation to perform logical deduction and so build ourselves a valid system of logic. But I really don’t see how this can be the case, or rather I don’t see how we can know whether we have achieved what we think we have achieved. If a demon (or machine superintelligence) could potentially control our perceptions, what prevents it from controlling our thoughts and intuitions? Similarly, if our perceptions of the external world can be warped by psychosis, why not our beliefs about logical deduction too? When we dream we find it very easy to accept the fantastic as plausible, and I don’t consider it a stretch that the mind could similarly be tricked into thinking it is performing logical deduction when it is in fact doing nothing of the sort.

But let us suppose that we can indeed build some logical system. Can transcendental knowledge be allowed in a self-consistent system? I have no argument for why it cannot; my only argument against it is that it seems extremely dangerous and vulnerable to self-deception. How does one tell the difference between information from a transcendental source and illusion?

Perhaps a counter-argument would be that all intuition is subject to such self-deception. I would be inclined to agree, but rather than using this argument to accept more intuitions, it instead drives me to reject as much of my own intuition as possible. Not in a pragmatic sense: of course our intuitions are well trained for many pedestrian matters. But when it comes to questions of fundamental significance, I cannot see any valid way to reason about them from intuition or feelings. I admit that actually adhering to such a policy is much more difficult, given that we need to begin from axioms…

Before I finish this somewhat rambling post, let me ask any readers I might have a question: what do you know about Descartes’ “evil demon”? (I alluded to this earlier: it is essentially the same as what we might call “the Matrix problem”: how do we know that all our perceptions of the supposedly external world are not controlled by an “evil demon”, a machine superintelligence, etc.?) I never read Descartes, so I don’t actually know what he had to say on the matter. It seems that he didn’t think such a hypothesis was plausible, but I don’t know why, especially since, as I understand it, he goes on to postulate the existence of an omnipotent benevolent creator. What logic leads him to one and not the other, I don’t know.


4 thoughts on “Battle for the soul – the God question and epistemology, or “why do you believe what you believe”?”

  1. Yep, Gödel definitely talked about all that stuff. As far as I know, there is no way around any of it except to have meta-meta-etc. things.

    I’d argue that intuition is important in all matters, provided that you have the right intuition – and realise when you don’t. You can’t begin to probe anything without having some intuition about where to start. In fact, I was just using this argument last week to justify to my real analysis students why they have to prove all the stuff that they can see is “obviously true” – their R^3-based intuition breaks down as soon as they go to infinite dimensions.

    • I would agree only that intuition is important for “speed”; that is, once we have convinced ourselves that some intuition or another is logically well-founded, we can just accept it and use it to justify further propositions in the future, without having to redo the rigor. And yes, in this process it is important to simultaneously train some intuition about when the first intuition has the potential to fail, so that we realise when we need to go back to fundamentals. Some theory-of-mind people call this “system 1” vs. “system 2” reasoning, I believe, although I think it is also accepted that the brain doesn’t actually divide intuitive and deliberate reasoning quite so cleanly as that model suggests.

      But that is all a matter of pragmatism. The real question, I think, is how we justify our foundational intuitions. Math is perhaps unique in this respect: its axioms are constructed by intuition, but do not rely on any intuition for their logical validity. We can build towers upon towers of connected propositions with (almost) no fear that their foundations will fail. But this is only because we have ourselves defined what logic is; we are not guaranteed that this logic has anything to do with the true nature of reality. I intuitively think that it must, but this is one of those intuitions which I have no way of knowing is “right”.

      • Ah yes, I think you’ve mentioned this “system 1 vs. system 2” thing before. Of course you can’t rely on intuition for rigour; I was just saying that the best intuition gives the best starting point.

        I’m not sure exactly what you mean by foundational intuitions then – do you mean what we accept as truths that we’re trying to model? If so, this is where my philosophy of science is different. I’ve always thought that we can never know any truths outside of “this is true within the framework we are working in”. We may get better and better at predicting things, but the models with which we predict seem to be just that – models. Maybe I just suck at philosophy though 😛

        Also, this “what is logic?” stuff was something that Gödel spent a fair bit of time on too. You should dig out some of his stuff.

      • Ahh, yes, I guess I was not very precise about that. What I mean is those intuitions which cannot be justified by logical deduction based on some more primitive intuitions. The idea being, then, that the only way to justify those intuitions, to claim you ‘know’ them to be true, is either by some appeal to a transcendental origin, or to some other sort of a priori knowledge. So I didn’t talk about it in the post, but I think all this comes down to a question about what is valid a priori knowledge and where it comes from.

        There was a good essay by Bertrand Russell I read at some point where he talks about this, but sadly I can’t find it now. I did find this essay (well, part of one of his books) which partially deals with the topic, though. About halfway in he makes the following statement, which is quite relevant:

        “Apart from minor grounds on which Kant’s philosophy may be criticized, there is one main objection which seems fatal to any attempt to deal with the problem of a priori knowledge by his method. The thing to be accounted for is our certainty that the facts must always conform to logic and arithmetic. To say that logic and arithmetic are contributed by us does not account for this. Our nature is as much a fact of the existing world as anything, and there can be no certainty that it will remain constant. It might happen, if Kant is right, that to-morrow our nature would so change as to make two and two become five. This possibility seems never to have occurred to him, yet it is one which utterly destroys the certainty and universality which he is anxious to vindicate for arithmetical propositions.”
