Higher-order thought theory, a revised form of representationalism, is a relatively recent theory of consciousness in the philosophical literature. It posits that a certain kind of higher-order awareness is required for a mental state to become phenomenally conscious. In this essay, I will explain Rosenthal’s higher-order thought theory and then present my primary reasons for questioning his proposal.
First, Rosenthal clarifies exactly what he means when he speaks about consciousness. He is not speaking of creature consciousness, for he believes that “we know in general terms what it is for a creature to be conscious; it is conscious if it is awake and sentient” (Rosenthal). That question, essentially the easy question of consciousness, is more or less uncontroversial. What Rosenthal is more interested in is the hard question of consciousness: the what-it’s-likeness of experience. He wants to explain the phenomenology of experience by reducing it to purely physical properties.
Rosenthal notes that we are never conscious of mental states that we are not aware of. He appeals to experiences like suddenly realizing that we are angry despite having been angry for some time. Sometimes we are not conscious of a thought or emotion until we become aware of it. Our senses also take in a great deal about the world, but we are not conscious of the vast majority of it. “Subliminal perception and peripheral vision remind us that perceptual sensations can occur without our being aware of them” (Rosenthal). For this reason, unlike purely representational theories, his theory of mind incorporates a notion of awareness. Where the Cartesian view of mind claims that every mental state is conscious, Rosenthal finds this impossible to believe for the reasons above. He thinks that a conscious experience requires something more.
This something more is what he calls higher-order thoughts (HOTs). He distinguishes two senses of consciousness: transitive consciousness, which is consciousness of something, and intransitive consciousness, which is a mental state’s being conscious. Rosenthal writes that “one is transitively conscious of something if one is in a mental state whose content pertains to that thing” (Rosenthal). Essentially, being transitively conscious of something is to have a mental state that in some way assertorically represents that thing. One thing to note is that the word “conscious” here does not mean phenomenally conscious (a common trend in philosophy where one word carries several distinct meanings). Here, Rosenthal means that the state one is in is “a thought about the thing, or a sensation of it,” but “that mental state need not be a conscious state” (Rosenthal). For a mental state to be intransitively conscious, he believes one must be transitively conscious of that state, and the thought that does this work is a HOT. So, for a thought or experience to be conscious, there must be a HOT about that thought or experience. “Plainly, if a state is a conscious state, we are transitively conscious of it” (Rosenthal).
Rosenthal posits a sort of HOT hierarchy in our minds. The base of the hierarchy consists of first-order states about the world and our experience, and every other thought is a HOT directed at another thought. Some thought, HOT-n, is conscious if and only if there exists a thought, HOT-n+1, whose content is about it. This is where my biggest qualms with Rosenthal’s higher-order thought theory lie.
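To put the condition schematically (this is my own shorthand, not Rosenthal’s notation, where S_n stands for a mental state at level n of the hierarchy, HOT(x) means that x is a higher-order thought, and About(x, y) means that the content of x pertains to y):

\[
\mathrm{Conscious}(S_n) \iff \exists S_{n+1}\,\bigl[\mathrm{HOT}(S_{n+1}) \wedge \mathrm{About}(S_{n+1}, S_n)\bigr]
\]

On this reading, the state doing the representing need not itself be conscious; it becomes conscious only if some further state at level n+2 is about it in turn.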
First, his conception of first-order thoughts is overly simplistic and not actually representative of how our brains function. Consider the case of peripheral vision that Rosenthal discusses. He claims that our peripheral vision is a first-order mental state about the world, but that it is not a phenomenal experience unless we are aware of what is going on in our periphery. My question is why Rosenthal can neatly declare the visual state to be the first-order thought when a substantial amount of computation occurs before that state ever arises. Suppose there is a black cat in your periphery that you are not conscious of. Parts of your brain take the raw data from your eyes and detect edges. That information is sent to other areas that combine the edges into shapes. Those shapes are then passed to regions that use more or less Bayesian reasoning to classify the shape as most likely a cat. In this case, there already exists a hierarchy of states directed at one another, and by Rosenthal’s criterion the states those stages represent would have to be conscious; yet we are evidently not conscious of edge or shape detection. Rosenthal declares the experience of seeing a cat to be the first-order thought, but it would make far more sense to declare the first-order thought to be the raw data coming from your eyes into your brain. It seems to me that Rosenthal conveniently declares an arbitrary level as first-order simply because that is the first level we ever experience as conscious, but it did not have to be this way. It is feasible to conceive of a phenomenal experience in which these low-level perceptual modules are conscious, so Rosenthal owes us an explanation of why they are not.
In response to this criticism, Rosenthal would probably declare that only certain kinds of mental states count as genuine first-order states. Just as only certain kinds of thoughts count as HOTs, he would most likely say that a first-order state is more than simply having data about the world, and that it takes a certain kind of assertoric representation of the world to be a candidate for consciousness. If this is the stance that Rosenthal would take, then that leads me to my second problem with his theory: that he is just a representationalist with extra jargon.
Rosenthal holds a non-relational version of HOT theory. This means that the HOT does all of the heavy lifting of consciousness, and there need not even be a lower-order thought for the HOT to point at in order for there to be a conscious experience. Essentially, instead of a HOT “pointing” at a lower-order thought, the HOT itself carries the content of the lower-order thought. This is how he explains instances where there is a mismatch between a first-order state (e.g., a throbbing pain) and the conscious experience (e.g., a vague, generic pain) (Rosenthal). This explanation is consistent with non-conscious perceptual states and with hallucinations of things that are not there. But if there is no actual hierarchy of thoughts pointing at one another, and it is the HOT alone that contains all of the information for consciousness, then Rosenthal is just a strong representationalist. If HOTs are not relational, then they simply amount to representing a thought or experience in a special way, in which case Rosenthal would just be arguing that representing something in a certain way is sufficient for consciousness. He might respond that his theory is different because of its inherent focus on awareness, something that representationalists don’t necessarily consider. However, if Rosenthal intends to reduce awareness to a physical process, then having an awareness of a thought is just a matter of representing that thought in a certain way.
Furthermore, Rosenthal holds that there is an inherent difference between first-order and higher-order thoughts. He believes that when a first-order state like a visual sensation or a pain is conscious, there is something it is like to have that experience. However, he also believes that there is nothing it is like to have a conscious thought that is not perceptual in nature. Although many philosophers would argue that there is some degree of phenomenology to conscious thought itself, Rosenthal is not in that group. But if this is the case, then Rosenthal is positing an inherent difference between qualitative first-order experiences and non-qualitative higher-order thoughts. He does not explain this difference, and its existence suggests that his theory does not get us any closer to a reductive theory of consciousness that answers the hard problem.
Rosenthal has many interesting ideas about how we can make sense of the structure of thought in the mind, but I am unconvinced that his theory brings anything substantially new to the table or explains why certain experiences feel the way they do.
Brown, Richard. “Consciousness, Higher-Order Theories of.” Routledge Encyclopedia of Philosophy, Taylor and Francis, 2019, doi:10.4324/9780415249126-V051-1, https://www.rep.routledge.com/articles/thematic/consciousness-higher-order-theories-of/v-1.
Rosenthal, David. "A Theory of Consciousness." Journal of Philosophy, vol. 83, no. 7, 1986, pp. 403-429.