Sunday, February 23, 2014

Reading Comprehension as Cognitive-Based Processing



The importance of schema theory to reading comprehension lies in how the reader uses schemata. I don't think research has fully resolved this issue yet, although it does suggest that some mechanism activates just those schemata most relevant to the reader's task.

There are several models of reading based on cognitive processing. For example, the LaBerge-Samuels Model of Automatic Information Processing (1974) emphasizes that internal aspects of attention are crucial to comprehension. This model is foundational to the theory of fluency.

In the early stages of reading, a beginning reader can pay attention only to decoding. As beginning readers become more proficient at word recognition, they begin to switch between decoding and comprehension, paying attention to both; but because neither is automatic, comprehension suffers. Finally, in the automatic stage of Automatic Information Processing, the reader recognizes words without attention, and comprehension can take place.

The LaBerge-Samuels model suggests three characteristics of attention:

*alertness: the reader's active attempt to access relevant schemata involving letter-sound relationships, syntactic knowledge, and word meanings.

*selectivity: the reader's ability to attend selectively to only that information requiring processing.

*limited capacity: the human brain has a limited amount of cognitive energy available for processing information. In other words, if a reader's cognitive energy is focused on decoding, attention cannot be directed at integrating, relating, and combining the meanings of the words decoded, and comprehension will suffer.

"Automaticity in information processing, then, simply means that information is processed with little attention" (Samuels, 1994). When processing is not automatic, comprehension suffers because the reader cannot rapidly and automatically access the concepts and knowledge stored in the schemata.
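The limited-capacity idea can be pictured with a small toy model. To be clear, the budget value, the costs, and the function name below are my own illustrative assumptions, not part of the LaBerge-Samuels model itself; the point is simply that whatever attention decoding consumes is unavailable for meaning-making.

```python
# Toy illustration of the LaBerge-Samuels limited-capacity idea.
# All numbers and names here are my own illustrative choices.

ATTENTION_BUDGET = 1.0  # total cognitive energy available at one time

def comprehension_capacity(decoding_cost: float) -> float:
    """Attention left for comprehension after decoding takes its share."""
    return max(ATTENTION_BUDGET - decoding_cost, 0.0)

# Beginning reader: decoding is effortful, so little is left for meaning.
print(round(comprehension_capacity(0.9), 2))   # 0.1

# Fluent reader: decoding is nearly automatic, freeing attention for meaning.
print(round(comprehension_capacity(0.05), 2))  # 0.95
```

On this picture, fluency instruction works by driving the decoding cost toward zero, which is exactly what the automatic stage describes.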

Another example of a cognitive-based model is Rumelhart's (1994) Interactive Model. Information from several knowledge sources (schemata for letter-sound relationships, word meanings, syntactic relationships, event sequences, and so forth) is considered simultaneously. The implication is that when information from one source, such as word recognition, is deficient, the reader will rely on information from another source, for example, context clues or previous experience.

Stanovich (1980) explains that when there is a deficiency in word recognition, interactive-compensatory processing becomes necessary: any reader compensates for deficiencies in one or more of the knowledge sources by using information from the remaining knowledge sources. Sources that are more concerned with concepts and semantic relationships are termed higher-level stimuli; sources dealing with the print itself, that is, phonics, sight words, and other word-attack skills, are termed lower-level stimuli. In Stanovich's interactive-compensatory model, the reader will rely on higher-level processes when lower-level processes are inadequate, and vice versa.
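Stanovich's compensatory idea can be sketched in a few lines of code. The confidence scale, the threshold, and the function name below are my own assumptions for illustration only; the sketch just shows one source stepping in when the other is inadequate.

```python
# Toy sketch of Stanovich's interactive-compensatory idea (illustrative
# only: the 0-to-1 confidence scale and the threshold are my assumptions).

THRESHOLD = 0.5  # arbitrary cutoff for "adequate" processing

def leading_source(lower_level: float, higher_level: float) -> str:
    """Report which knowledge source carries the interpretation.

    lower_level  = confidence from print-based skills (phonics, sight words)
    higher_level = confidence from concepts, context, and semantics
    """
    if lower_level >= THRESHOLD:
        return "lower-level processing leads"
    if higher_level >= THRESHOLD:
        return "higher-level processing compensates"
    return "neither source is adequate"

print(leading_source(0.8, 0.3))  # lower-level processing leads
print(leading_source(0.2, 0.9))  # higher-level processing compensates
```

The same logic runs in the other direction too: a reader with strong decoding but weak background knowledge leans on the print itself.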


These cognitive-based processing models of reading support my thinking that teachers need to use a variety of approaches when teaching reading. Studies conducted by Chomsky (1978) and LaBerge (1979) indicate that striving readers made greater gains in comprehension and reading speed when attention was paid to fluency and automaticity. To increase fluency, reader confidence, and motivation, we should provide our students with the practice and strategies they require to become automatic readers.

3 comments:

  1. So I really didn't read yours until now! Wow - either we are on the same wavelength or we share a brain - Lollll. Nice job!

  2. Nice explication of the theories Paulina! I'd like you in future posts to make connections to your own research/work as well. Consider some hyperlinks/visuals too.

  3. I love your blog about automaticity theory. I totally agree that we need to consider fostering this in our students, even high school students. I know in my Strategies classes we do automaticity work with them. We have an independent reading program, which is one way we try to accomplish this with our struggling readers, but we also do a lot of fluency work with them in the specific disciplines, which they tend to be less automatic in. Once a week we have them read passages from science and history three times. They time themselves and record in their vocabulary book any words that were unfamiliar to them in the reading of the passages. We use Marzano's Six Step vocabulary process to help them learn these words; I look at it as sight word training in some of the disciplines. They keep a little booklet of all of their words and we have them revisit the words several times. They web them, play games with them, and discuss them as much as possible. I feel like high school students don't really get a lot of fluency training, especially in the disciplines, and I really have found value in it. Some of the students expressed appreciation of the vocab practice, as some of the words showed up on the ACT. This helps them feel more confident with the reading in the disciplines as well. Great job summarizing all of the cognitive theories. I appreciated reading your blog to help keep some of them straight in my own head. There are many to keep track of, and I appreciate the opportunity to tie these theories into our classroom practices in discussions and writing.
