How Knowledge Helps

It Speeds and Strengthens Reading Comprehension, Learning—and Thinking

"Knowledge is Good." So read the motto of the mythical Faber College in the 1978 movie, Animal House. Those of us who work in education would agree, even if we were unable to express ourselves so eloquently. But why, exactly, is knowledge good? When I've discussed this question with teachers, many have used the metaphor "It's grist for the mill." That is, the goal of education is seen not so much as the accumulation of knowledge, but as the honing of cognitive skills such as thinking critically. Knowledge comes into play mainly because if we want our students to learn how to think critically, they must have something to think about.

It's true that knowledge gives students something to think about, but a reading of the research literature from cognitive science shows that knowledge does much more than just help students hone their thinking skills: It actually makes learning easier. Knowledge is not only cumulative; it grows exponentially. Those with a rich base of factual knowledge find it easier to learn more—the rich get richer. In addition, factual knowledge enhances cognitive processes like problem solving and reasoning. The richer the knowledge base, the more smoothly and effectively these cognitive processes—the very ones that teachers target—operate. So, the more knowledge students accumulate, the smarter they become. We'll begin by exploring how knowledge brings more knowledge and then turn to how knowledge improves the quality and speed of thinking.

I. How Knowledge Brings More Knowledge

The more you know, the easier it will be for you to learn new things. Learning new things is actually a seamless process, but in order to study it and understand it better, cognitive scientists have approached it as a three-stage process. And they've found that knowledge helps at every stage: as you first take in new information (either via listening or reading), as you think about this information, and as the material is stored in memory. We'll consider each of these stages in turn.

How Knowledge Helps You Take in New Information

The first stage in which factual knowledge gives you a cognitive edge is when you are taking in new information, whether by listening or reading. There is much more to comprehending oral or written language than knowing vocabulary and syntax. Comprehension demands background knowledge because language is full of semantic breaks in which knowledge is assumed and, therefore, comprehension depends on making correct inferences. In a casual conversation, the listener can gather missing background knowledge and check on his inferences by asking questions (e.g., Did you mean Bob Smith or Bob Jones? What do you mean when you describe him as an entrepreneur?)—but this is not the case when watching a movie or reading a book. (And sometimes it isn't the case in class when a student is too embarrassed to ask a question.)

To provide some concrete examples and simplify the discussion, let's focus on reading—but keep in mind that the same points apply to listening. Suppose you read this brief text: "John's face fell as he looked down at his protruding belly. The invitation specified 'black tie' and he hadn't worn his tux since his own wedding, 20 years earlier." You will likely infer that John is concerned that his tuxedo won't fit, although the text says nothing directly about this potential problem. The writer could add the specifics ("John had gained weight since he last wore his tuxedo, and worried that it would not fit"), but they are not necessary and the added words would make the text dull. Your mind is well able to fill in the gaps because you know that people are often heavier 20 years after their wedding, and that gaining weight usually means that old clothing won't fit. This background knowledge about the world is readily available and so the writer need not specify it.

Thus, an obvious way in which knowledge aids the acquisition of more knowledge lies in the greater power it affords in making correct inferences. If the writer assumes that you have some background knowledge that you lack, you'll be confused. For example, if you read, "He was a real Benedict Arnold about it" and you don't know who Benedict Arnold was, you're lost. This implication of background knowledge is straightforward and easy to grasp. It is no surprise, then, that the ability to read a text and make sense of it is highly correlated with background knowledge (Kosmoski, Gay, and Vockell, 1990). If you know more, you're a better reader.

Most of the time you are unaware of making inferences when you read. For example, when you read the text above it's unlikely you thought to yourself, "Hmmm ... let me see now ... why am I being told about the last time he wore his tuxedo? Why would thinking about that make his face fall?" Those conscious inferences are unnecessary because the cognitive processes that interpret what you read automatically access not just the literal words that you read, but also ideas associated with those words. Thus, when you read "tux," the cognitive processes that are making sense of the text can access not just "a formal suit of clothing," but all of the related concepts in your memory: Tuxedos are expensive, they are worn infrequently, they are not comfortable, they can be rented, they are often worn at weddings, and so on. As the text illustrates, the cognitive processes that extract meaning also have access to concepts represented by the intersection of ideas; "tux" makes available "clothing," and "20 years after wedding" makes available "gaining weight." The intersection of "clothing" and "gaining weight" yields the idea "clothing won't fit" and we understand why John is not happy. All of these associations and inferences happen outside of awareness. Only the outcome of this cognitive process—that John is concerned his tux won't fit anymore—enters consciousness.

Sometimes this subconscious inference-making process fails and the ideas in the text cannot be connected. When this happens, processing stops and a greater effort is made to find some connection among the words and ideas in the text. This greater effort requires conscious processing. For example, suppose that later in the same text you read, "John walked down the steps with care. Jeanine looked him up and down while she waited. Finally she said, 'Well, I'm glad I've got some fish in my purse.'" Jeanine's comment might well stop the normal flow of reading. Why would she have fish? You would search for some relationship between carrying fish to a formal event and the other elements of the situation (formal wear, stairs, purses, what you've been told of Jeanine and John). In this search you might retrieve the popular notion that wearing a tuxedo can make one look a little like a penguin, which immediately leads to the association that penguins eat fish. Jeanine is likening John to a penguin and thus she is teasing him. Sense is made, and reading can continue. Here, then, is a second and more subtle benefit of general knowledge: People with more general knowledge have richer associations among the concepts in memory; and when associations are strong, they become available to the reading process automatically. That means the person with rich general knowledge rarely has to interrupt reading in order to consciously search for connections.

This phenomenon has been verified experimentally by having subjects read texts on topics with which they are or are not very familiar. For example, Johanna Kaakinen and her colleagues (2003) had subjects read a text about four common diseases (e.g., flu) for which they were likely already familiar with the symptoms, and a text about four uncommon diseases (e.g., typhus) for which they likely were not. For each text, there was additional information about the diseases that subjects likely did not know.

The researchers used a sophisticated technology to unobtrusively measure where subjects fixated their eyes while they read each text. Researchers thus had a precise measure of reading speed, and they could tell when subjects returned to an earlier portion of the text to reread something. The researchers found that when reading unfamiliar texts, subjects more often reread parts of sentences and they more often looked back to previous sentences. Their reading speed was also slower overall compared to when they read familiar texts. These measures indicate that processing is slower when reading about something unfamiliar to you.

Thus, background knowledge makes one a better reader in two ways. First, it means that there is a greater probability that you will have the knowledge to successfully make the necessary inferences to understand a text (e.g., you will know that people are often heavier 20 years after their wedding and, thus, John is worried that his tux won't fit). Second, rich background knowledge means that you will rarely need to reread a text in an effort to consciously search for connections in the text (e.g., you will quickly realize that with her fish remark, Jeanine is likening John to a penguin).

How Knowledge Helps You Think about New Information

Comprehending a text so as to take in new information is just the first stage of learning that new information; the second is to think about it. This happens in what cognitive scientists call working memory, the staging ground for thought. Working memory is often referred to metaphorically as a space to emphasize its limited nature; one can maintain only a limited amount of information in working memory. For example, read through this list one time, then look away and see how many of the letters you can recall.

CN

NFB

ICB

SCI

ANC

AA

There were 16 letters on the list, and most people can recall around seven—there is not sufficient space in working memory to maintain more than that. Now try the same task again with this list.

CNN

FBI

CBS

CIA

NCAA

Much easier, right? If you compare the two lists, you will see that they actually contain the same letters. The second list has been reorganized in a way that encourages you to treat C, N, and N as a single unit, rather than as three separate letters. Putting items together this way is called chunking. It greatly expands how much fits in your working memory—and, therefore, how much you can think about. The typical person's working memory can hold about seven letters or almost the same number of multi-letter chunks or pieces of information. Note, however, that chunking depends on background knowledge. If you weren't familiar with the abbreviation for the Federal Bureau of Investigation, you couldn't treat FBI as a single chunk.
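
If it helps to see the point spelled out, here is a brief illustrative sketch (written in Python for this discussion, not drawn from the research itself): both lists are built from exactly the same 16 letters, so the only thing that changes is how many chunks a knowledgeable reader must hold in working memory.

# The same 16 letters, grouped two ways. For a reader who knows the acronyms,
# the second grouping collapses 16 separate items into 5 familiar chunks.
unfamiliar_groups = ["CN", "NFB", "ICB", "SCI", "ANC", "AA"]
familiar_chunks = ["CNN", "FBI", "CBS", "CIA", "NCAA"]

# Both lists contain the identical letters...
assert sorted("".join(unfamiliar_groups)) == sorted("".join(familiar_chunks))

# ...but the working-memory load differs sharply.
print(len("".join(unfamiliar_groups)), "individual letters to hold")        # 16
print(len(familiar_chunks), "chunks to hold, if you know the acronyms")     # 5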

The ability to chunk and its reliance on background knowledge has been tested in a number of studies. These studies show that this ability makes people better able to briefly remember a list of items, just as you could remember more letters in the second example. This benefit has been observed in many domains, including chess (Chase and Simon, 1973), bridge (Engle and Bukstel, 1978), computer programming (McKeithen, Reitman, Rueter, and Hirtle, 1981), dance steps (Allard and Starkes, 1991), circuit design (Egan and Schwartz, 1979), maps (Gilhooly, Wood, Kinnear, and Green, 1988), and music (Sloboda, 1976).

Of course, we seldom want to briefly remember a list. The important aspect of chunking is that it leaves more free space in working memory, allowing that space to be devoted to other tasks, such as recognizing patterns in the material. For example, in one study (Recht and Leslie, 1988), the researchers tested junior high school students who were either good or poor readers (as measured by a standard reading test) and who were also knowledgeable or not about the game of baseball (as measured by a test created for the study by three semi-professional baseball players). The children read a passage written at an early 5th-grade reading level that described a half inning of a baseball game. The passage was divided into five parts, and after each part the student was asked to use a replica of a baseball field and players to reenact and describe what they read. The researchers found that baseball knowledge had a big impact on performance: Poor readers with a high knowledge of baseball displayed better comprehension than good readers with a low knowledge of baseball.

What's going on here? First, the students with a lot of knowledge of baseball were able to read a series of actions and chunk them. (For example, if some of the text described the shortstop throwing the ball to the second baseman and the second baseman throwing the ball to the first baseman resulting in two runners being out, the students with baseball knowledge would chunk those actions by recognizing them as a double play—but the students without baseball knowledge would have to try to remember the whole series of actions.) Second, because they were able to chunk, the students with baseball knowledge had free space in their working memory that they could devote to using the replica to reenact the play as well as providing a coherent verbal explanation. Without being able to chunk, the students with little baseball knowledge simply didn't have enough free space in their working memory to simultaneously remember all of the actions, keep track of their order, do the reenactment, and describe the reenactment.

This study illustrates the importance of the working memory advantage that background knowledge confers (see also Morrow, Leirer and Altieri, 1992; Spilich, Vesonder, Chiesi, and Voss, 1979). Most of the time when we are listening or reading, it's not enough to understand each sentence on its own—we need to understand a series of sentences or paragraphs and hold them in mind simultaneously so that they can be integrated or compared. Doing so is easier if the material can be chunked because it will occupy less of the limited space in working memory. But, chunking relies on background knowledge.

How Knowledge Helps You Remember New Information

Knowledge also helps when you arrive at the final stage of learning new information—remembering it. Simply put, it is easier to fix new material in your memory when you already have some knowledge of the topic (Arbuckle et al., 1990; Beier and Ackerman, 2005; Schneider, Korkel, and Weinert, 1989; Walker, 1988). Many studies in this area have subjects with either high or low amounts of knowledge on a particular topic read new material and then take a test on it some time later; inevitably those with prior knowledge remember more.

A study by David Hambrick (2003) is notable because it looked at real-world learning and did so over a longer period of time than is typical in such studies. First, Hambrick tested college students for their knowledge of basketball. This test took place in the middle of the college basketball season. Two and one-half months later (at the end of the season), subjects completed questionnaires about their exposure to basketball (e.g., game attendance, watching television, and reading magazines or newspapers) and also took tests that measured their knowledge of specific men's basketball events from the prior two and one-half months. The results showed (not surprisingly) that subjects who reported an interest in the game also reported that they had had greater exposure to basketball information. The more interesting finding was that, for a given level of exposure, greater prior basketball knowledge was associated with more new basketball knowledge. That is, the people who already knew a lot about basketball tended to remember more basketball-related news than people with the same exposure to this news but less prior knowledge.* As I said in the introduction, the rich get richer.

What's behind this effect? A rich network of associations makes memory strong: New material is more likely to be remembered if it is related to what is already in memory. Remembering information on a brand new topic is difficult because there is no existing network in your memory that the new information can be tied to. But remembering new information on a familiar topic is relatively easy because developing associations between your existing network and the new material is easy.

*  *  *

Some researchers have suggested that prior knowledge is so important to memory that it can actually make up for or replace what we normally think of as aptitude. Some studies have administered the same memory task to high-aptitude and low-aptitude children, some of whom have prior knowledge of the subject matter and some of whom do not; the studies found that only prior knowledge is important (Britton, Stimson, Stennett, and Gülgöz, 1998; Recht and Leslie, 1988; Schneider, Korkel, and Weinert, 1989; Walker, 1988). But some researchers disagree. They report that, although prior knowledge always helps memory, it cannot eliminate the aptitude differences among people. Since everyone's memory gets better with prior knowledge, assuming equal exposure to new knowledge (as in a classroom without extra support for slower students), the student with overall lower aptitude will still be behind the student with higher aptitude (Hall and Edmondson, 1992; Hambrick and Engle, 2002; Hambrick and Oswald, 2005; Schneider, Bjorklund, and Maier-Brückner, 1996). In the end, the issue is not settled, but as a practical matter of schooling, it doesn't matter much. What matters is the central, undisputed finding: All students will learn more if they have greater background knowledge.

II. How Knowledge Improves Thinking

Knowledge enhances thinking in two ways. First, it helps you solve problems by freeing up space in your working memory. Second, it helps you circumvent thinking by acting as a ready supply of things you've already thought about (e.g., if you've memorized that 5 + 5 = 10, you don't have to draw two groups of five lines and count them). To simplify the discussion, I'll focus mostly on research that explores the benefits of knowledge for problem solving, which is essentially the type of thinking that students must do in mathematics and science classes. But keep in mind that in much the same way, knowledge also improves the reasoning and critical thinking that students must do in history, literature, and other humanities classes.

How Knowledge Helps You Solve Problems

In the last section, I discussed one way that prior knowledge helps reading: It allows you to chunk some information, which leaves more room in working memory to sort through the implications of a text. You get much the same benefit if you are trying to solve a problem. If you don't have sufficient background knowledge, simply understanding the problem can consume most of your working memory, leaving no space for you to consider solutions. I can give you a sense of this impact with a sample problem called the Tower of Hanoi. The picture shows three pegs with three rings of increasing size. The goal is to move all the rings to the rightmost peg. There are just two rules: You can only move one ring at a time, and you can't put a larger ring on top of a smaller ring. See if you can solve the problem.

[[{"type":"media","view_mode":"wysiwyg","fid":"926","attributes":{"alt":"Illustration of the \\\"Tower of Hanoi" problem","height":"207","width":"276","style":"height: 207px; width: 276px;","class":"media-image media-element file-wysiwyg"},"link_text":null}]]

With some diligence, you may well be able to solve the problem. The solution is to move the rings as follows, where each move names a ring (A for the smallest through C for the largest) and the peg, numbered 1 to 3 from left to right, that it moves to: A3, B2, A2, C3, A1, B3, A3.
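
For readers who want to see where that seven-move sequence comes from, here is a minimal sketch of the standard recursive solution (written in Python for illustration; the puzzle, not the code, is what matters here). It uses the same notation as above: a letter for the ring, A smallest to C largest, followed by the number of the destination peg.

def hanoi(n, source, target, spare, labels):
    """Print the moves that transfer the n smallest rings from the source peg to the target peg."""
    if n == 0:
        return
    # Park the n - 1 smaller rings on the spare peg...
    hanoi(n - 1, source, spare, target, labels)
    # ...move the nth ring straight to its destination...
    print(f"{labels[n]}{target}", end=" ")
    # ...then bring the smaller rings over on top of it.
    hanoi(n - 1, spare, target, source, labels)

# Three rings (A = smallest, C = largest) start on peg 1; the goal is the rightmost peg, 3.
hanoi(3, source=1, target=3, spare=2, labels={1: "A", 2: "B", 3: "C"})
print()  # prints: A3 B2 A2 C3 A1 B3 A3

Seven moves is the minimum for three rings; in general, n rings require 2^n - 1 moves, which is one reason the puzzle gets hard so quickly.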

Now consider this problem:

In the inns of certain Himalayan villages is practiced a refined tea ceremony. The ceremony involves a host and exactly two guests, neither more nor less. When his guests have arrived and seated themselves at his table, the host performs three services for them. These services are listed in the order of the nobility the Himalayans attribute to them: stoking the fire, fanning the flames, and pouring the tea. During the ceremony, any of those present may ask another, "Honored Sir, may I perform this onerous task for you?" However, a person may request of another only the least noble of the tasks that the other is performing. Furthermore, if a person is performing any tasks, then he may not request a task that is nobler than the least noble task he is already performing. Custom requires that by the time the tea ceremony is over, all the tasks will have been transferred from the host to the most senior of the guests. How can this be accomplished?

You would probably have to read the problem several times just to feel that you understand it—but this problem is actually identical to the Tower of Hanoi. Each person (the host and the two guests) is like a peg, and each task is like a ring. The goal and the rules of transfer are the same. The difference is that this version is much more demanding of working memory. The first version does not require you to maintain the problem in working memory because it is so effectively represented in the figure. The second version requires that the solver remember the order of nobility of the tasks, whereas in the first version you can easily chunk the order of ring size—smallest to largest.

These two problems give you a sense of the advantages of background knowledge for problem solving. The problem solver with background knowledge in a particular domain sees problems in her domain like the Tower of Hanoi; everything is simple and easy to understand. When she is outside her domain, however, the same problem solver cannot rely on background knowledge and problems seem more like the confusing tea ceremony. It's all she can do to simply understand the rules and the goal.

These examples put the "grist for the mill" metaphor in a new light: It's not sufficient for you to have some facts for the analytic cognitive processes to operate on. There must be lots of facts and you must know them well. The student must have sufficient background knowledge to recognize familiar patterns—that is, to chunk—in order to be a good analytical thinker. Consider, for example, the plight of the algebra student who has not mastered the distributive property. Every time he faces a problem with a(b + c), he must stop and plug in easy numbers to figure out whether he should write a(b) + c or a + b(c) or a(b) + a(c). The best possible outcome is that he will eventually finish the problem—but he will have taken much longer than the students who know the distributive property well (and, therefore, have chunked it as just one step in solving the problem). The more likely outcome is that his working memory will become overwhelmed and he either won't finish the problem or he'll get it wrong.
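
To make that extra work concrete, here is the kind of check the struggling student runs on every single problem, sketched in Python for illustration (the particular numbers and variable names are mine, not taken from any study):

# Plugging in easy numbers to decide how a(b + c) expands: the detour that
# eats up working memory. The fluent student already knows a(b + c) = ab + ac.
a, b, c = 2, 3, 5
target = a * (b + c)                          # 2 * 8 = 16
print("a*b + c   =", a * b + c)               # 11 -- rules this form out
print("a + b*c   =", a + b * c)               # 17 -- rules this form out
print("a*b + a*c =", a * b + a * c)           # 16 -- matches, so this is the one

Of course, a single numerical check does not prove the general rule; the point is how much slower and more memory-hungry this detour is than simply recalling the chunked fact that a(b + c) = ab + ac.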

How Knowledge Helps You Circumvent Thinking

It's not just facts that reside in memory; solutions to problems, complex ideas you've teased apart, and conclusions you've drawn are also part of your store of knowledge. Let's go back to the algebra students for a moment. The student who does not have the distributive property firmly in memory must think it through every time he encounters a(b + c), but the student who does, circumvents this process. Your cognitive system would indeed be poor if this were not possible; it is much faster and less demanding to recall an answer than it is to solve the problem again. The challenge, of course, is that you don't always see the same problem, and you may not recognize that a new problem is analogous to one you've seen before. For example, you may have successfully solved the Tower of Hanoi problem and moments later not realized that the tea ceremony problem is analogous.

Fortunately, knowledge also helps with this: A considerable body of research shows that people get better at drawing analogies as they gain experience in a domain. Whereas novices focus on the surface features of a problem, those with more knowledge focus on the underlying structure of a problem. For example, in a classic experiment Michelene Chi and her colleagues (Chi, Feltovich, and Glaser, 1981) asked physics novices and experts to sort physics problems into categories. The novices sorted by the surface features of a problem—whether the problem described springs, an inclined plane, and so on. The experts, however, sorted the problems based on the physical law needed to solve it (e.g., conservation of energy). Experts don't just know more than novices—they actually see problems differently. For many problems, the expert does not need to reason, but rather, can rely on memory of prior solutions.

Indeed, in some domains, knowledge is much more important than reasoning or problem-solving abilities. For example, most of the differences among top chess players appear to be in how many game positions they know, rather than in how effective they are in searching for a good move. It seems that there are two processes to selecting a move in chess. First, there is a recognition process by which a player sees which part of the board is contested, which pieces are in a strong or weak position, and so forth. The second process is one of reasoning. The player considers possible moves and their likely outcome. The recognition process is very fast, and it identifies which pieces the slower reasoning process should focus on. But the reasoning process is very slow as the player consciously considers each possible move. Interestingly, a recent study indicates that the recognition process accounts for most of the differences among top players. Burns (2004) compared the performance of top players at normal and blitz tournaments. In blitz chess, each player has just five minutes to complete an entire game, whereas in a normal tournament, players would have at least two hours. Even though play was so sped up that the slow reasoning processes barely had any time to contribute to performance, the relative ratings of the players were almost unchanged. That indicates that what's making some players better than others is differences in their fast recognition processes, not differences in their slow reasoning processes. This finding is rather striking. Chess, the prototypical game of thinking and reflection, turns out to be largely a game of memory among those who are very skilled. Some researchers estimate that the best chess players have between 10,000 and 300,000 chess-piece chunks in memory (Gobet and Simon, 2000).

Burns's (2004) study of chess skill meshes well with studies of science education. A recent meta-analysis (Taconis, Ferguson-Hessler, and Broekkamp, 2001) evaluated the results of 40 experiments that studied ways to improve students' scientific problem-solving skills. The results showed that the successful interventions were those that were designed to improve students' knowledge base. Especially effective were those in which students were asked to integrate and relate different concepts by, for example, drawing a concept map or comparing different problems. Interventions designed to improve the students' scientific problem-solving strategies had little or no impact, even though the goal of all the studies was to improve scientific problem solving.

We've seen how knowledge improves learning and thinking. But what does this mean for the classroom? My sidebar "Knowledge in the Classroom" offers some strategies for building students' store of knowledge.


Daniel T. Willingham is professor of cognitive psychology at the University of Virginia and author of Cognition: The Thinking Animal. He is author of American Educator's regular feature, "Ask the Cognitive Scientist." His research focuses on the role of consciousness in learning.

*Careful readers may notice that in this study there is some possibility that the college students' interest in basketball (not just their knowledge) could have some effect on their memory of basketball events. A more complicated study controlled for interest by creating experts. Subjects were brought in to pre-learn some information (which then served as their background knowledge) and then return two days later to learn additional knowledge. The researchers still found a memory boost from background knowledge (Van Overschelde and Healy, 2001). (back to article)

References

Allard, F., and Starkes, J. L. (1991). Motor-skill experts in sports, dance, and other domains. In K. A. Ericsson and J. Smith (eds.), Toward a general theory of expertise: Prospects and limits (pp. 126–152). New York: Cambridge University Press.

Arbuckle, T. Y., Vanderleck, V. F., Harsany, M., and Lapidus, S. (1990). Adult age differences in memory in relation to availability and accessibility of knowledge-based schemas. Journal of Experimental Psychology: Learning, Memory, and Cognition, 16, 305–315.

Beier, M. E. and Ackerman, P. L. (2005). Age, ability and the role of prior knowledge on the acquisition of new domain knowledge: Promising results in a real-world environment. Psychology and Aging, 20, 341–355.

Britton, B. K., Stimson, M., Stennett, B., and Gülgöz, S. (1998). Learning from instructional text: Test of an individual differences model. Journal of Educational Psychology, 90, 476–491.

Burns, B. B. (2004). The effects of speed on skilled chess performance. Psychological Science, 15, 442–447.

Chase, W. G., and Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.

Chi, M. T. H., Feltovich, P., and Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121–152.

Egan, D. E., and Schwartz, B. J. (1979). Chunking in recall of symbolic drawings. Memory & Cognition, 7, 149–158.

Engle, R. W., and Bukstel, L. (1978). Memory processes among bridge players of differing expertise. American Journal of Psychology, 91, 673–689.

Gilhooly, K. J., Wood, M., Kinnear, P. R., and Green, C. (1988). Skill in map reading and memory for maps. Quarterly Journal of Experimental Psychology: Human Experimental Psychology, 40, 87–107.

Gobet, F. and Simon, H.A. (2000). Five seconds or sixty? Presentation time in expert memory. Cognitive Science, 24, 651–682.

Hall, V. C. and Edmondson, B. (1992). Relative importance of aptitude and prior domain knowledge on immediate and delayed post-tests. Journal of Educational Psychology, 84, 219–223.

Hambrick, D. Z. (2003). Why are some people more knowledgeable than others? A longitudinal study of knowledge acquisition. Memory & Cognition, 31, 902–917.

Hambrick, D. Z. and Engle, R. W. (2002). Effects of domain knowledge, working memory capacity, and age on cognitive performance: An investigation of the knowledge-is-power hypothesis. Cognitive Psychology, 44, 339–387.

Hambrick, D. Z. and Oswald, F. L. (2005). Does domain knowledge moderate involvement of working memory capacity in higher-level cognition? A test of three models. Journal of Memory and Language, 52, 377–397.

Kaakinen, J. K., Hyönä, J., and Keenan, J. M. (2003). How prior knowledge, WMC, and relevance of information affect eye fixations in expository text. Journal of Experimental Psychology: Learning, Memory, and Cognition, 29, 447–457.

Kosmoski, G. J., Gay, G., and Vockell, E. L. (1990). Cultural literacy and academic achievement. Journal of Experimental Education, 58(4), 265–272.

McKeithen, K. B., Reitman, J. S., Rueter, H. H., and Hirtle, S. C. (1981). Knowledge organization and skill differences in computer programmers. Cognitive Psychology, 13, 307–325.

Morrow, D. G., Leirer, V. O., and Altieri, P. A. (1992). Aging, expertise, and narrative processing. Psychology and Aging, 7, 376–388.

Recht, D. R. and Leslie, L. (1988). Effect of prior knowledge on good and poor readers' memory of text. Journal of Educational Psychology, 80, 16–20.

Schneider, W., Bjorklund, D. F. and Maier-Brückner, W. (1996). The effects of expertise and IQ on children's memory: When knowledge is, and when it is not enough. International Journal of Behavioral Development, 19, 773–796.

Schneider, W., Korkel, J., and Weinert, F. E. (1989). Domain-specific knowledge and memory performance: A comparison of high- and low-aptitude children. Journal of Educational Psychology, 81, 306–312.

Sloboda, J. (1976). Visual perception of musical notation: Registering pitch symbols in memory. Quarterly Journal of Experimental Psychology, 28, 1–16.

Spilich, G. J., Vesonder, G. T., Chiesi, H. L., and Voss, J. F. (1979). Text processing of domain-related information for individuals with high- and low-domain knowledge. Journal of Verbal Learning and Verbal Behavior, 18, 275–290.

Taconis, R., Ferguson-Hessler, M. G. M., and Broekkamp, H. (2001). Teaching science problem solving: An overview of experimental work. Journal of Research in Science Teaching, 38, 442–468.

Van Overschelde, J. P. and Healy, A. F. (2001). Learning of nondomain facts in high- and low-knowledge domains. Journal of Experimental Psychology: Learning, Memory, and Cognition, 27, 1160–1171.

Walker, C. H. (1988). Relative importance of domain knowledge and overall aptitude on acquisition of domain-related information. Cognition and Instruction, 4, 25–42.

American Educator, Spring 2006