Computers in Human Behavior 21 (2005) 487–508
www.elsevier.com/locate/comphumbeh

Information problem solving by experts and novices: analysis of a complex cognitive skill

Saskia Brand-Gruwel a,*, Iwan Wopereis a, Yvonne Vermetten b

a Open University of the Netherlands, Educational Technology Expertise Center, P.O. Box 2960, 6401 DL Heerlen, The Netherlands
b NHTV Breda University of Professional Education, The Netherlands

Available online 18 November 2004

Abstract

In (higher) education students are often faced with information problems: tasks or assignments that require them to identify information needs, locate corresponding information sources, extract and organize relevant information from each source, and synthesize information from a variety of sources. It is often assumed that students master this complex cognitive skill of information problem solving all by themselves. In our view, however, explicit and intensive instruction is necessary. A skill decomposition is needed in order to design instruction that fosters the development of information problem solving. This research analyzes the information problem-solving process of novices and experts in order to reach a detailed skill decomposition. Results reveal that experts spend more time on the main skill 'define problem' and more often activate their prior knowledge, elaborate on the content, and regulate their process. Furthermore, experts and novices show few differences in the way they search the Internet. These findings formed the basis for formulating instructional guidelines.
© 2004 Elsevier Ltd. All rights reserved.

* Corresponding author. E-mail address: saskia.brand-gruwel@ou.nl (S. Brand-Gruwel).
doi:10.1016/j.chb.2004.10.005

1. Introduction

Our current society is transforming into an information society. Both social and technological developments have contributed to a situation where information plays a key role (see Boekhorst, 2000). According to Marchionini (1999), the proliferation of electronic information technologies for computation and communication has sped up this transformation process. However, these new technologies require people to manage the resulting overload of information adequately. People must be able to identify information needs, to locate corresponding information sources, to extract and organize relevant information from each source, and to synthesize information from a variety of sources into cogent, productive uses (Moore, 1995). All the skills, knowledge and attitudes needed to carry out the above-mentioned activities can be defined as information literacy (Bawden, 2001; Marchionini, 1999; Shapiro & Hughes, 1996; Spitzer, 2000) or as information problem solving (Eisenberg & Berkowitz, 1990, 1992; Moore, 1995, 1997).

It is not surprising that the importance of information literacy, or the ability to solve information problems, in our current society has its repercussions on education. In contemporary education – due to a shift towards a learning-focused paradigm in instructional theory (see Reigeluth, 1999) – new curricula emerge that often appeal to information problem-solving skills.
Examples are environments for resource-based learning (Hill & Hannafin, 2001; Macdonald, Heap, & Mason, 2001), problem-based learning (Savery & Duffy, 1995), project-based learning (Land & Greene, 2000), and competence-based learning (Kirschner, Valcke, & Van Vilsteren, 1997). The transmission of knowledge is no longer the primary educational aim. Students are expected to construct their own knowledge, search for and process information, and combine it with their prior knowledge in order to tackle authentic tasks and problems. The skill of information problem solving has thus become particularly important. It is often assumed that students master this skill all by themselves. From our point of view, however, explicit and intensive instruction is required.

Information problem solving can be characterized as a complex cognitive skill, because it takes considerable time to achieve an adequate level of competence (cf. Van Merriënboer, 1997). Attention should therefore be paid to the design of effective instruction. A skills analysis or skill decomposition is needed to design such instruction. The four main functions of a skill decomposition are the (1) identification, (2) description, and (3) classification of sub skills, as well as (4) the specification of a macro-level sequence according to which the sub skills will be dealt with in the instructional program (Van Merriënboer, 1997). The identification of sub skills requires the development of a "skills hierarchy", after which the identified sub skills can be described in performance objectives.

The present study chose a comparison between novices and advanced information problem solvers (in the rest of this article called experts) to analyze the complex skill of information problem solving. By choosing this approach two results were attained: (1) a decomposition and further analysis of the complex cognitive skill, and (2) insight into the critical (sub) skills that distinguish experts from novices. Instructional guidelines will be derived from these results.

In recent decades the process of solving an information problem has been studied extensively. Wilson (1999) describes a series of three "nested" research fields, which makes it possible to place the research on information problem solving in a broader perspective. Research on information problem solving is best placed within the research field of 'information-seeking behavior'. Research in this area is aimed at unraveling human behavior while searching, acquiring, processing, organizing and presenting information (Ellis, Cox, & Hall, 1993; Kuhltau, 1993). A subset of 'information-seeking behavior' is 'information-searching behavior'. Research within this field is focused on clarifying the process of searching and locating information (Hill, 1999; Hölscher & Strube, 2000; Lazonder, 2000, 2003; Marchionini, 1995; Sutcliffe & Ennis, 1998; Zins, 2000). In particular, the use of (electronic) information retrieval systems as a strategy for collecting information is an important research topic within the domain of 'information-searching behavior' (Ingwersen, 1996; Spink, 1997).
Since electronic information retrieval systems, like hypertext databases, online public access catalogues and particularly the Internet, are widely used, it is not surprising that contemporary research on both 'information-seeking behavior' and 'information-searching behavior' mainly focuses on seeking and searching behavior while using electronic systems. 'Information-seeking behavior' and its subset 'information-searching behavior' are nested within the research area of 'information behavior'. Research in this larger field concerns not only intentional information behavior but also unintentional behavior, for instance, passively watching a television commercial (Wilson, 1999).

Research concerning the process of information problem solving as part of the field of 'information-seeking behavior' has resulted in a variety of descriptive and prescriptive models (Eisenberg & Berkowitz, 1990; Ellis et al., 1993; Irving, 1985; Kuhltau, 1993; Stripling & Pitts, 1988). One of these models, the 'Big6™ model' of Eisenberg and Berkowitz (Eisenberg & Berkowitz, 1990, 1992, 2000; Eisenberg & Johnson, 2002), is selected as a point of departure for setting up a preliminary model for information problem solving. The preliminary model serves as a frame of reference for the construction of the instruments for analyzing the experimental data in order to develop a skills hierarchy. The 'Big6™' is chosen because it covers the various stages of the information-seeking process. Moreover, this model has proven to be a successful (Eisenberg, 2003) and effective (Wolf, Brush, & Saye, 2003) prescriptive educational model. The 'Big6™' distinguishes six stages in order to foster information problem-solving skills: (a) task definition, (b) information-seeking strategies, (c) location and access, (d) use of information, (e) synthesis, and (f) evaluation.

Based on criticisms (see Boekhorst, 2000; MacKenzie, 1994), the model is adjusted and transformed into a descriptive model. The most essential alteration is the addition of a regulation category. As in other process models, higher-order thinking skills such as general problem solving and metacognitive processing are underexposed in the 'Big6™'. Therefore, in our first attempt to disentangle the process of information problem solving, explicit attention is paid to metacognition by adding regulation as an important new component to the model. The evaluation stage of the 'Big6™ model' is transposed to the regulation component, since (summative) evaluation of the process and the product is regarded as a regulation activity (Vermunt, 1998). Fig. 1 presents the preliminary model.

Fig. 1. Preliminary descriptive model for information problem solving: five constituent skills (define the information problem, select sources of information, search and find information, process information, organize and present information) governed by an overarching regulation component.

1.1. Define the information problem

The process of information problem solving starts with the 'recognition' of a need for information. A thorough identification of this need can be regarded as the determination or definition of a problem. A problem definition is comprehensive when a clear description of the problem and of the type and amount of information required for solving it are given. During the process of defining the problem, prior knowledge is activated in memory.
This is important, because activating prior knowledge eases the integration of the 'to be found' information with knowledge that is already available (Hill, 1999; Moore, 1995). Formulating an inclusive problem definition is essential for the process of information problem solving. For instance, research by Land and Greene (2000) indicates that a goal-driven approach, which is supported by formulating a clear problem, leads to better results than an undirected, data-driven approach.

1.2. Select sources of information

Once the information problem is formulated, sources of information for solving the problem are considered. Potentially interesting sources of information are reflected on and, based on criteria such as reliability, validity, precision, completeness, accuracy, availability, novelty and cost, sources are selected and prioritized. This process results in a plan or search strategy.

1.3. Search and find information

Searching and finding information is aimed at searching the selected sources and finding information within these sources. The formulated search strategy facilitates the search for information. Once a source is found, it is not studied in depth. Instead, the information within a source is scanned and typified as relevant or irrelevant. When all the selected sources have been searched and sufficient information has been located within these sources, a more profound examination of the information follows. This is seen as part of information processing. The search strategy for searching and finding information is adapted in case inadequate information is located within the selected sources. Boekhorst (2000) emphasizes that, in order to be successful in searching and finding information, knowledge about information and communication technology is essential. Other researchers have also pointed out the importance of computer skills in this action-directed phase (e.g., Marchionini, 1995; Sutcliffe & Ennis, 1998).

1.4. Process information

While processing information, the information found is studied thoroughly. This means that the information found is (again) selected, analyzed in depth, related to prior knowledge and (re)structured in order to reach deep understanding (see Dochy, 1993; Schmeck & Geisler-Brenstein, 1989). While processing information, the relevance and quality of the information are continuously examined and related to the problem stated.

1.5. Organize and present information

The process of organizing and presenting information concerns the synthesis of relevant information into cogent, productive uses. The form of the product depends on the task one has to perform. Examples of products are reports, articles, letters, lectures and presentations. Although products differ, the process that leads to them is similar. This process can be characterized as a structured, iterative process of (re)organizing and fixing information.

1.6. Regulation of information problem solving

Regulation plays an important role in the coordination of the process of information problem solving. Various classifications of regulation activities have been formulated (see De Jong & Simons, 1988; Vermunt, 1995). In our preliminary model for information problem solving a set of regulation activities, based on research by De Jong (1992) and De Jong and Simons (1988), is used: orientation towards a task or problem, steering the process, monitoring the process, and testing process and product.
Orientation towards a task includes the analysis of (a) the task and/or (b) the task performance. The current situation is taken into account, the assignment (task) and the product asked for are (re)considered, and time on task as well as prior knowledge and competency are examined. Based on an orientation, a well-grounded decision about (further) task performance can be made. Steering is focused on deciding which activities have to be performed. Steering occurs on a macro level (planning) and on a micro level (deciding what to do next). Monitoring the process means that someone keeps an eye on task performance; it is less profound than an orientation towards a task. Testing is aimed at evaluating process and product. When this is done during task performance it is formative; when both process and product are evaluated at the end of the process of information problem solving it is called summative. Summative evaluation is relevant for fine-tuning future performance.

Research by Hill (1999), Hill and Hannafin (1997), Land and Greene (2000) and Marchionini (1995) has revealed that the quality of regulation is directly related to the effectiveness and efficiency of the information problem-solving process. There is also evidence that the use of metacognitive knowledge and skills during the process can compensate for a lack of subject matter knowledge (Land & Greene, 2000; Moore, 1995).

As stated earlier, the aim of this study is to make, with the preliminary model as a starting point, a decomposition of the complex cognitive skill 'solving information problems'. A second goal is to make a comparison between experts and novices, in order to be able to classify and sequence the (sub) skills for designing instruction. The main questions concerning the expert-novice comparison are: To what extent does the information problem-solving process of the experts differ from the process of the novices, with regard to (1) the time investment in performing the main skills, (2) the frequency of performing the main skills and their sub skills, (3) the use of particular search strategies while searching information on the Internet, (4) the use of particular regulation activities, and (5) the quality of the products produced? The answers to these questions will help to develop guidelines for instruction that fosters the complex skill of information problem solving.

2. Method

2.1. Participants

Five experts and five novices voluntarily participated in the study. The experts were PhD students (two female, three male) in the field of Educational Technology from the Open University of the Netherlands. They were all in their final year. The novices were Psychology freshmen (four female, one male) from the University of Maastricht.

2.2. Materials

2.2.1. Task

The participants were asked to solve an information problem while thinking aloud. The task description was: 'How must we deal with the perishability of food? Can we consume food that is out of date? Or must we rely on our senses? Write (in Microsoft Word) an argument of about 400 words, which is meant for a consumers' magazine. Use information from the Internet to build up your argumentation.' The topic of perishability was chosen because participants' prior knowledge on this topic was expected to be comparable.
2.2.2. Instrument to analyze the thinking-aloud protocols

An inductive-deductive method was used to develop the coding system for analyzing the thinking-aloud protocols. The coding system was based on the protocols and on the model described in the Introduction, and was tested and re-adjusted in a few iterations. During these iterations it became clear that no statements were found concerning the main skill 'select sources'. The purpose of this skill is to come up with a selection of appropriate sources and to define a search strategy beforehand in order to solve the problem. In the experiment the participants had to use the Internet, so the main source was given. It turned out that the participants did not express their search strategy beforehand while thinking aloud. However, the search strategy used could be deduced from the participants' actions on the Internet. These search strategies were scored as belonging to particular search patterns. Furthermore, in the data a clear distinction could be made between searching information and scanning information. While searching the Internet, participants often scanned sites before processing the information in more detail or depth. As a result of these findings, the protocols were scored on the following reformulation of main skills: 'define problem', 'search information', 'scan information', 'process information' and 'organize and present information'.

For scoring the protocols three kinds of codes were used: descriptive codes, interpretative codes, and pattern codes (Miles & Huberman, 1994). Descriptive codes entail little interpretation and can be attributed to segments of the text in a straightforward way. Interpretative codes require more interpretation by the rater. Pattern codes are even more inferential and explanatory. They signal themes that account "...for a lot of other data, make them intelligible, and function like a statistical 'factor', grouping disparate pieces into one more inclusive and meaningful whole" (Miles & Huberman, 1994, p. 58).

The scoring system itself consisted of three types of categories, organized in three columns that were scored simultaneously. The first and second columns pertained to the reformulated main skills and their sub skills. In the first column, the five main skills of information problem solving were scored in an exclusive and exhaustive way. In the second column, the categories representing the sub skills were scored. Each main skill was refined into several sub skills that could only be scored as a sub skill of that main skill. For instance, the skill 'define problem' consisted of four sub skills: (a) read the task, (b) explain or concretize the problem, (c) activate prior knowledge, and (d) determine the task requirements. In the third column, regulation of the process and eight pre-defined pattern codes were scored. These categories could be scored independently of the scoring in the other two columns. Regulation of the process included (a) monitoring and steering of one's own working process, (b) orientation on the process, and (c) testing of the results during and after the process. Each of these regulation components was divided into sub components. For instance, orientation on the process consisted of orientation on time, task orientation, and orientation on the defined problem.
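To make the structure of this three-column scheme concrete, a minimal sketch in Python is given below. The category labels follow the skill decomposition and regulation categories reported in this article (the sub skills of the remaining main skills are taken from the decomposition presented later, in Fig. 2); the data structures themselves and the ProtocolSegment class are our illustration, not part of the original coding instrument. The eight pattern codes, described in the following paragraph, were scored separately and are omitted here.

```python
# Illustrative sketch (not the authors' instrument): the three simultaneously
# scored columns of the coding system, represented as plain Python structures.
from dataclasses import dataclass
from typing import Optional

# Columns 1 and 2: main skills and their sub skills (scored exhaustively and exclusively).
MAIN_SKILLS = {
    "define problem": [
        "read the task",
        "explain or concretize the problem",
        "activate prior knowledge",
        "determine the task requirements",
    ],
    "search information": ["Internet skills", "derive search terms", "judge search results"],
    "scan information": ["Internet skills", "scan site", "judge scanned information",
                         "elaborate on content"],
    "process information": ["read text", "elaborate on content", "judge processed information"],
    "organize and present information": [
        "formulate problem", "outline the product", "structure the product",
        "formulate text", "elaborate on content",
    ],
}

# Column 3: regulation categories, scored independently of the other columns.
REGULATION = {
    "orientation": ["orientation on time", "task orientation", "orientation on the defined problem"],
    "monitoring and steering": [],   # sub components not enumerated in the article
    "testing": ["testing during the process", "testing after the process"],
}

@dataclass
class ProtocolSegment:
    """One scored segment of a thinking-aloud protocol."""
    text: str
    main_skill: str                   # one of MAIN_SKILLS
    sub_skill: Optional[str] = None   # a sub skill of that main skill
    regulation: Optional[str] = None  # a REGULATION category, if applicable
```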
Two of the eight pattern codes were related to the moment of deciding that the information was sufficient for completing the task: (a) the first pattern indicated that a participant searched a lot of information first and decided later on that sufficient information had been found, and (b) the second pattern indicated that after a short period of searching the participant decided that sufficient information had been found. Three pattern codes concerned the way the participants searched the Internet and were based on research by Carroll (1999): (a) meandering: starting from a list of results and surfing from site to site using hyperlinks, (b) browsing subject categories or databases: starting a search from a structured site and finding the information needed through refining, and (c) list link: going to a site by using a results overview of a content-based search (usually from a search engine), returning to that overview and going to a new site, and so forth. The final three pattern codes were based on research by Marchionini (1995) and were related to the search strategy used. Again there were three possibilities: (a) a goal-oriented approach: participants seek information in the context of a goal, hypothesis, or question, (b) a data-oriented approach: participants identify broad subject areas, conduct a search, read information on a topic, and formulate the goals, hypotheses or questions from the resources, and (c) an opportunistic approach: participants begin with an entry point and proceed according to what occurs along the way.

2.2.3. Instrument for scoring the patterns

After scoring the pattern codes using the scoring system, the raters scored the eight patterns again using a five-point Likert scale. By using this scale an overall picture of the search strategies used was obtained. A score of 1 was given when a pattern did not occur, and a score of 5 was given when a pattern was obvious.

2.2.4. Instrument for scoring the products

A rating form was developed to assess the final products (i.e., the 400-word arguments) of the experts and the novices. The form consisted of 18 items. The items were classified into four categories: the structure of the argument (five items), the quality of the content (eight items), the style of writing (four items), and the layout (one item). Sixteen items were scored on a five-point Likert scale and two items were scored with yes or no. Examples of items are: 'Is there a title?' (yes/no) and 'The problem is clearly formulated in the introduction' (from 1 = totally disagree to 5 = totally agree).

2.3. Design and procedure

Sessions were held in the Multimedia Laboratory of the Open University of the Netherlands. At the beginning of each individual session, the participant was informed by the experimenter about the purpose and procedure of the session and about what thinking aloud involved. The participant also read the task and could ask questions about it (10 min). After the experimenter had left the room, the participant had one and a half hours to complete the task. During this time the participant used the Internet to search for information and Microsoft Word to write the argument and present the information. During the session all computer actions, including Internet use, and the thinking-aloud expressions of the participant were recorded on digital video.
The experimenter watched the participant through a one-way screen and could communicate with the participant by microphone; this was mainly used to encourage the participant to keep thinking aloud and to answer any questions. All tapes were transcribed into protocols.

2.4. Data analyses

Two trained raters scored the protocols and the video recordings using the coding system. Both raters scored 6 of the 10 protocols. The inter-rater reliability (Cohen's κ) was calculated for these protocols and the raters reached consensus on the statements they disagreed on. Only one rater scored the remaining four protocols. Table 1 gives an overview of the inter-rater reliabilities on the main skills, sub skills, and regulation.

Table 1. Inter-rater reliability (Cohen's κ) on the main skills and on regulation

Protocol   Define problem   Search information   Scan information   Process information   Organize and present information   Regulation   Total
1          1.0              0.58                 0.74               0.69                  0.63                               0.62         0.69
2          0.42             0.75                 0.58               0.24                  0.58                               0.60         0.59
3          0.50             0.83                 0.60               0.64                  0.62                               0.67         0.74
4          1.0              0.43                 0.71               0.59                  0.74                               0.66         0.67
5          1.0              0.80                 0.55               Not scored            0.68                               0.59         0.76
6          1.0              0.70                 0.58               Not scored            0.69                               0.66         0.72
Total      0.64             0.72                 0.63               0.63                  0.66                               0.63         0.70

The same two raters scored the search patterns using the five-point Likert scale and scored the quality of the products. The inter-rater reliability of the scoring instrument (Cohen's κ) was 0.51 for the search patterns and 0.42 for the products. The raters reached consensus on both the search patterns and the products.

One-way ANOVAs were used to analyze the differences between the experts and the novices in the use of the main skills and their sub skills. For analyzing differences with regard to the search patterns and the quality of the products, the non-parametric Kruskal-Wallis test was used.
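For readers who want to reproduce this kind of agreement check, the sketch below shows how Cohen's κ can be computed for two raters' codings of the same protocol segments. It is illustrative only: the labels are invented example data, not the study's codings, and it assumes scikit-learn is available.

```python
# Illustrative sketch: Cohen's kappa for two raters who coded the same
# protocol segments. The labels below are invented example data.
from sklearn.metrics import cohen_kappa_score

rater_a = ["define", "search", "search", "scan", "scan", "process", "organize"]
rater_b = ["define", "search", "scan",   "scan", "scan", "process", "organize"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # 1 = perfect agreement, 0 = chance-level agreement
```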
3. Results

The first research question concerned the decomposition of the information problem-solving skill. As mentioned in the Method section, adjustments to the preliminary model were made during the development of the coding system and while analyzing the protocols. In this section, the main skills and sub skills are further identified and described. Identification and description are the first two functions of the skill decomposition; the other two functions, classification and sequencing, will be addressed in Section 4.

Fig. 2 gives an overview of the observed main skills and sub skills. The process of information problem solving consists of five main skills and a regulation skill. These skills are broken down into sub skills. The skills define the information problem, search information, scan information and process information can be seen as the analysis part of the process. Organize and present information can be seen as the synthesis part of the process.

Fig. 2. Skill decomposition of the information problem-solving skill. Regulation (orientation, monitoring and steering, testing) governs five main skills and their sub skills: define the information problem (read task, concretize problem, activate prior knowledge, clarify task requirements); search information and scan information, which both build on Internet skills (search: derive search terms, judge search results; scan: scan site, judge scanned information, elaborate on content); process information (read text, elaborate on content, judge processed information); and organize and present information (formulate problem, outline the product, structure the product, formulate text, elaborate on content).

The main skill define the information problem will always be performed at the beginning of the process. In order to get clear insight into the problem, a good performance of the sub skills is required. The task must be accurately read and the problem must be concretized in terms of a well-formulated question accompanied by sub questions. Concretizing the problem also includes formulating the information that is needed. Prior knowledge must be activated in order to specify the information that must be searched for. Also, the task requirements must be clear.

The aim of the main skill search information is that a person selects important or interesting sources and gets an overview of the search results. Different sub skills can be distinguished within this main skill. First of all, Internet skills are required. These Internet skills are not further described in this study. Furthermore, it is important to derive the right search terms and to judge the search results on quality, relevance and reliability. The criteria for this judgment depend on the defined problem and the information that is needed.

The goal of the main skill scan information is to scan and judge the information on quality and relevance in order to decide whether or not the information must be linked to the given problem (in the light of the requirements of the final product). Criteria for judging can be derived from the defined problem. While scanning the information, an elaboration on the content will take place: the new information will be combined with prior knowledge or with other found information. So, the important sub skills are, besides the necessary Internet skills: scan site, judge scanned information, and elaborate on content.

The main skill process information refers to deep processing, as opposed to scanning, of information. Reading, elaborating on the content, and judging the processed information are the observed sub skills. Activities such as analyzing, selecting, and structuring information are important for elaboration. Again, the judgment of the usefulness and quality of the information by using defined criteria is important. The ultimate aim is comprehension of the information, that is, reaching an integration of the different pieces of found information and relevant prior knowledge so that the information problem can be solved.

The main skill organize and present information refers to making the product as required in the task. In the present study students had to write an argument. A writing task is one type of task; other types of products are possible, such as a presentation or a poster. It is assumed that the sub skills observed are also applicable in other settings. While working on the product, relevant sub skills are formulate the problem, outline the product, structure the product, formulate the text and elaborate on the content. The problem that forms the basis of the to-be-developed product must be clearly formulated, no matter what product is precisely required.
The product must also be outlined, meaning that the main set-up or layout must be determined. The different components of the product (defined in the outline) must be further structured and filled in. During this process, elaboration on the content takes place.

While executing all these skills in order to solve the information problem, regulation activities take place continuously. Task performers must constantly keep track of their ongoing process, monitor and steer their performance, orient themselves on the task, manage their time, test the content and quality, and evaluate product and process. As a consequence the process is iterative.

The problem-solving processes of experts and novices were compared to gather more information on the sub skills and to develop guidelines for instruction in information problem solving. First, the differences in time investment in the different skills will be described. Second, differences in the frequency of performing the main skills and the sub skills are presented. Third, the differences in the search patterns will be discussed. Fourth, the differences in regulation activities are reported. Finally, the differences in the quality of the analyzed products will be described.

3.1. Time investment in the different skills

The average time the experts spent to complete the task was 91.7 min (SD = 6.46). The average time spent by the novices was 71.6 min (SD = 20.06). This difference is marginally significant at the 10% level, F(1,9) = 4.47, MSE = 225.88, p < 0.10. Note that the time spent by the experts was the maximum available time, so a ceiling effect occurred. Table 2 shows the means and standard deviations of the time investment in the main skills by the novices and the experts. The time investment in this table is given in percentages because the total time investment differed somewhat between the novices and the experts.

Table 2. Differences in time invested in the main skills between experts and novices, in percentage of time

                                    Experts (n = 5)      Novices (n = 5)
                                    M       SD           M       SD
Define problem*                     3.32    2.08         0.62    0.41
Search information                  18.35   8.39         20.10   6.70
Scan information                    24.01   7.39         25.70   15.70
Process information                 12.10   10.43        2.68    4.20
Organize and present information    42.95   9.29         50.73   15.19
* p < 0.05.

Only the percentage of time spent on defining the problem was significantly different between the experts (M = 3.32, SD = 2.08) and the novices (M = 0.62, SD = 0.41), F(1,9) = 8.05, MSE = 1.81, p < 0.05.

3.2. Differences in the frequency of use of the main skills and sub skills

Table 3 presents the means and the standard deviations for the number of times that a particular main skill or sub skill was performed by the experts and the novices. One-way ANOVAs were used to determine whether there were differences between the experts and the novices. For all participants the main skill 'define problem' occurred only once, at the beginning of the task. When students looked back into the task description during their performance of the task and, for instance, took notice of the task requirements, this was scored as orientation on the task (a sub skill of the regulation variable 'orientation') and not as define problem. With regard to the sub skills of this main skill, it appears that the experts read the task more often than the novices, F(1,9) = 7.54, MSE = 0.65, p < 0.05, and that they also activated their prior knowledge more often, F(1,9) = 6.00, MSE = 0.15, p < 0.05.
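All comparisons in this and the following subsections are two-group tests with five participants per group: one-way ANOVAs for the frequency counts and Kruskal-Wallis tests for the ordinal ratings. The sketch below shows how such tests can be run with SciPy; the counts are invented for illustration and are not the study's raw data.

```python
# Illustrative only: two-group comparison of per-participant frequency counts
# (five experts vs. five novices); the numbers below are invented.
from scipy.stats import f_oneway, kruskal

experts = [4, 3, 2, 3, 3]
novices = [2, 1, 2, 1, 2]

f_stat, p_anova = f_oneway(experts, novices)   # parametric test on the group means
h_stat, p_kw = kruskal(experts, novices)       # non-parametric alternative for ordinal scores

print(f"One-way ANOVA:  F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")
```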
Experts and novices showed no differences for the main skill 'search information' or its sub skills. Internet skills and search strategies are important for this main skill as well as for the main skill 'scan information'. To determine whether the Internet skills of the experts and novices differed, frequently used Internet actions were counted, namely the use of a search engine, the use of the address bar for typing a URL, the selection of sites, the use of links, the use of an internal search engine, and switching between windows. No significant differences were found between the experts and the novices for these actions.

For the main skill 'scan information', two differences were found for its sub skills. Compared to the novices, the experts elaborated on the content more often, F(1,9) = 6.00, MSE = 3.75, p < 0.05, and there was a tendency for the experts to judge the information more often, F(1,9) = 4.03, MSE = 23.85, p < 0.10. The judgment concerned the quality, relevance and reliability of the information. No qualitative analysis was done regarding this aspect.

Table 3. Means and standard deviations for the number of times that main skills and sub skills were performed by experts and novices

                                      Experts            Novices
Main skills and sub skills            M       SD         M       SD
Analysis
Define problem                        1.00    –          1.00    –
  Read task*                          3.00    1.00       1.60    0.55
  Concretize problem                  0.40    0.89       0.00    –
  Activate prior knowledge*           0.60    0.55       0.00    –
  Clarify task requirements           0.20    0.45       0.00    –
Search information                    10.60   5.28       13.00   9.00
  Derive search terms                 7.20    4.02       9.20    8.90
  Judge search results                15.40   9.56       18.20   15.40
Scan information                      13.60   5.41       12.60   6.79
  Scan site                           29.60   9.99       23.60   13.86
  Judge scanned information^a         22.00   5.10       14.80   4.66
  Elaborate on content*               5.00    2.00       2.00    1.87
Process information*                  5.80    4.09       1.20    1.64
  Read                                10.00   9.67       2.20    2.86
  Elaborate on content                7.40    7.92       2.80    5.72
  Judge processed information         10.20   13.14      3.40    5.98
Synthesis
Organize and present information      2.80    2.17       5.00    4.24
  Formulate problem*                  1.00    0.77       0.00    –
  Outline the product                 2.20    1.30       1.80    2.17
  Structure the product               11.20   7.79       17.80   14.69
  Formulate text                      10.40   2.97       10.60   6.35
  Elaborate on content                8.40    4.04       7.60    7.37
^a p < 0.10. * p < 0.05.

The experts processed information more often than the novices, F(1,9) = 5.45, MSE = 9.70, p < 0.05. As can be seen in Table 2, the experts also spent more time on the main skill 'process information' (experts and novices spent 12.10% and 2.68% of their time on it, respectively). However, due to the high standard deviation in the expert condition this difference was not significant. So, some experts probably had more but shorter episodes of processing information, while novices had fewer but longer episodes. Furthermore, no significant differences were found for the three sub skills of the main skill 'process information'.

Together, the above-mentioned main skills and sub skills form the analysis part of the information problem-solving skill. The synthesis part consists of the main skill 'organize and present information'.
A significant difference between the experts and the novices was found for one of its sub skills: the experts performed the skill 'formulate problem' more often than the novices, F(1,9) = 10.00, MSE = 0.25, p < 0.05.

3.3. Regulation

Table 4 presents the means and standard deviations for the experts and the novices on the regulation variables, that is, monitoring and steering, orientation, and testing.

Table 4. Means and standard deviations for the experts and the novices on the regulation variables

                        Experts (n = 5)      Novices (n = 5)
                        M       SD           M       SD
Monitoring/steering*    21.60   6.50         10.40   7.06
Orientation             10.40   7.60         5.60    2.70
Testing                 4.00    2.92         3.00    3.16
Total regulation^a      36.00   16.34        19.00   11.77
^a p < 0.10. * p < 0.05.

With regard to total regulation, the experts regulated their process slightly more often than the novices, F(1,9) = 3.56, MSE = 202.75, p < 0.10. During task performance, the experts monitored and steered their process more frequently than the novices, F(1,9) = 6.81, MSE = 46.05, p < 0.05. No significant differences were found on the variables 'orientation' and 'testing'. These two variables were composed of sub variables. For the sub variable 'orientation on time', the experts (M = 3.40, SD = 2.70) managed their time more often than the novices (M = 0.40, SD = 0.55), F(1,9) = 5.92, MSE = 3.80, p < 0.05. There were no significant differences for the sub variables 'orientation on task', 'testing the completeness of the product', and 'testing the quality of the product'.

3.4. Search patterns

Table 5 presents the means and standard deviations for experts' and novices' use of the different search patterns.

Table 5. Means and standard deviations for the experts and the novices on the search patterns

                                         Experts (n = 5)      Novices (n = 5)
                                         M       SD           M       SD
Decision if information is sufficient
  Late in process                        3.40    1.34         3.20    1.30
  Early in process                       3.00    1.14         3.60    1.41
Way of searching the Internet
  Meandering                             2.20    0.45         2.40    0.89
  Browsing subject categories            2.40    0.89         3.20    1.30
  List link^a                            4.80    0.45         4.00    0.71
Search strategies
  Goal-oriented                          4.00    0.00         3.60    0.55
  Data-oriented                          3.00    0.00         2.80    0.84
  Opportunistic                          2.60    0.89         3.20    1.10
^a p < 0.10.

No differences appeared between the experts and the novices with regard to the moment at which they decided that the gathered information was sufficient for completing the task. Looking at the means, it seems that experts and novices searched the Internet in the same way: both groups used the meandering strategy least often and mainly searched using the list-link approach. However, a trend can be seen in the use of the list-link approach: experts appear to use this approach more often than the novices (χ²(1, N = 10) = 3.41, p < 0.10). Finally, there was no marked difference in the search strategies used: both groups mainly used the goal-oriented approach.

3.5. Quality of end products

Table 6 presents the means and standard deviations for the novices and the experts on the quality criteria for assessing the end products, that is, the 400-word arguments.

Table 6. Means and standard deviations for the experts and the novices on the categories for assessing the products

Quality criteria            Experts (n = 5)      Novices (n = 5)
                            M       SD           M       SD
Structure of the argument   3.87    0.77         2.29    1.40
Quality of the content      3.80    0.36         3.15    1.01
Style of writing*           3.23    0.43         2.29    0.72
Lay-out*                    4.20    0.37         2.60    0.40
* p < 0.05.
The experts scored significantly higher than the novices on the quality criteria 'style of writing' (χ²(1, N = 10) = 4.42, p < 0.05) and 'layout' (χ²(1, N = 10) = 5.55, p < 0.05). For the categories 'structure of the argument' and 'quality of the content' no significant differences were found.

Two items were scored with yes or no. Whereas all the experts (100%) used a title for their argumentation, only 60% of the novices formulated a title; this difference is not significant. Compared with the novices, the experts referred significantly more often to the sources they used: 80% of the experts did so, whereas none of the novices mentioned the sources they used (χ²(1, N = 10) = 6.00, p < 0.05).

4. Discussion

The aim of this study was to decompose the complex cognitive skill of information problem solving by observing experts and novices who performed an authentic information problem-solving task. The task is representative of the information problem-solving tasks that higher education students will be confronted with in their professional or daily life. The task could be characterized as a study task, which required the participants to develop a written product. In students' school careers and in their future jobs, written products (argumentations, reports, essays, articles) are often required.

In order to detail the skill decomposition and to develop guidelines for instruction in solving information problems, expert-novice differences were determined with respect to: (1) time investment in the main skills, (2) frequency of use of main skills and sub skills, (3) regulation activities, (4) the occurrence of search patterns concerning the moment of deciding whether sufficient information has been gathered, the way the Internet is searched, and the search strategies used, and (5) the quality of the end products.

The skill decomposition was based on a preliminary model and the thinking-aloud data. Observations resulted in adjustments and additions to the preliminary model. The categories 'select sources' and 'search and find' in the model were replaced by the new skills 'search information' and 'scan information'. In the protocols no statements were found concerning source selection beforehand. This selection can also be seen as a component of 'search information'. Other researchers (Boekhorst, 2000) also do not distinguish source selection as a main skill or main component of the process of information problem solving. The skill 'scan information' was originally part of the skill 'search and find', but in our data a clear distinction could be made between search actions and scan actions; in the original model those stages were more interwoven.

In the skill decomposition, skills and sub skills are defined. In order to arrive at more sophisticated guidelines for instruction, it is advisable to gain more insight into the most important skills or sub skills, or the skills that really make a difference. The expert-novice analysis gave more insight into which skills need more attention and need to be further analyzed.

The comparison between the experts and the novices indicates some interesting differences between the two groups. First, the experts spent somewhat more time on the whole information problem-solving task than the novices. Actually, the experts used all the time that was available (i.e., one and a half hours), indicating that the difference with the novices might have been even larger had there been no time limit.
In particular, the experts spent more time than the novices on defining the problem at the beginning of the process. During this phase, the experts also read the task more often and activated their relevant prior knowledge more often than the novices. This is in agreement with the problem-solving literature, which indicates that experts typically pay more attention to the analysis of problems (and the evaluation of solutions) than novices (Land & Greene, 2000).

With regard to the frequency of use of different skills, differences between experts and novices were found for scanning, processing, and organizing and presenting information. While scanning the information, the experts elaborated more often on the content than the novices, and there was also a tendency for the experts to judge the quality and relevance of the information and the reliability of the sources more often. The experts also processed information more often than the novices. Furthermore, both experts and novices invested a lot of time in organizing and presenting the information, that is, in writing the argumentation. The main difference between the experts and the novices is that the experts paid frequent attention to the (re)formulation of the problem, whereas the novices ignored this completely.

With regard to the regulation of the problem-solving process, regulation activities were somewhat more frequent for experts than for novices. In particular, the experts showed more monitoring and steering activities during task performance and oriented themselves more often on the time left to accomplish the task. In the literature it is often stated that regulation is associated with good strategy use and with a more effective and efficient process (e.g., Land & Greene, 2000; Moore, 1995). Although the experts regulated their process more often, there is no indication that their process was more efficient, because they spent the maximum amount of time and there is no hard evidence that their writing products were of higher quality than those of the novices. This may be due to the fact that all participants frequently used the 'cut and paste' function, so that text fragments from the Internet were directly included in the argument. However, the experts had a better style of writing than the novices, used a more effective layout for their argument, and more often referred to the sources that supported their argumentation. The experts also wrote better connections between those text fragments than the novices.

With respect to the search patterns, few differences occurred between the experts and the novices. It seems that all participants searched the Internet in the same way. They all used the list-link approach most of the time, although the experts used this approach more often. The experts and novices did not differ either in their use of a goal-driven, data-driven, or opportunistic approach. The fact that no differences between experts and novices were found in the use of the data-driven and the goal-driven approach may be due to the characteristics of the information problem-solving task, which was quite open. For example, there were no restrictions on the content of the argument. Therefore, a combination of both approaches could be considered an appropriate approach for completing the task. While searching the Internet, participants came upon interesting information and decided to use this information in their argumentation.
This is in line with the data-driven approach. But participants also often used the information they found to adjust and change their preconceived plan, which is in line with the goal-driven approach. What must be noted is that the opportunistic approach occurred more often than expected: in both groups the raters observed that participants occasionally surfed from one site to the other. This may be due to the shifts participants made between the goal-driven and data-driven approaches, which the raters may have interpreted as the use of an opportunistic approach.

The results from this research gave us insight into the process of information problem solving by experts and novices. However, some remarks are in order. First, the small number of participants limits the power of our study. For instance, a larger number of participants would make it possible to use extreme groups and so maximize the differences between novices and experts. In the current study, there was one novice who had certain characteristics of an expert; compared to the other four novices, she showed a lot of deep processing of information. A second remark concerns the fact that participants used the Internet to gather information. An alternative would be that students go to a library or another information center to search for information. Library skills instead of Internet skills are then needed in order to find information. It could be argued that the process of information problem solving has different characteristics in both situations, which may have consequences for the way people solve information problems (e.g., not preparing a preconceived plan for searching the Internet, because this is so much faster than searching in a library). Although the same stages in the process can be recognized in both situations, the way people switch between those stages and the relative emphasis on different skills will probably differ. With these remarks in mind, guidelines for teaching information problem-solving skills can be formulated.

4.1. Instructional guidelines

Table 7 presents an overview of the instructional guidelines discussed in this section. These guidelines can be seen as a result of the third and fourth functions of the skill decomposition, namely, the classification of skills and making a macro-level sequence according to which the skills will be dealt with in the instructional program (Van Merriënboer, 1997).

For classification, the main question is which skills must or must not be taught. It is possible that, depending on the target group, certain skills do not need to be trained because students already master them or because the skills are a prerequisite for the course. Looking at the sub skills of solving information problems, one may decide not to provide training in Internet skills. For those skills, no difference between experts and novices was found, and it seems that the patterns for searching the Web (using a Web browser and a search engine) are unrelated to expertise. If students enter an information problem-solving course without having any Internet skills, it seems plausible to train these skills beforehand. However, it should be ensured during the information problem-solving training that transfer of the learned Internet skills occurs.
Table 7. Overview of guidelines for instruction in the information problem-solving skill
1. Decide which sub skills are and are not necessary to train.
2. Use a whole-task approach, starting with simple tasks and increasing task complexity as learners acquire more expertise.
3. Train important sub skills: defining the problem; judging the reliability of the sources and the quality and relevance of the information; deep processing of the gathered information.
4. Pay attention to the regulation of the process, for instance by using the cognitive apprenticeship approach.
5. Train the skill of information problem solving in different domains in order to stimulate transfer.

The next question that arises is in which sequence the skills must be dealt with in the instructional program. The results of our study show that the main skills and sub skills involved in information problem solving are highly interrelated and performed in an iterative fashion. Experts frequently switch between skills, the result accomplished by one sub skill is often the input for another sub skill, and the coordination between skills is very important. Therefore, a whole-task approach is recommended for teaching information problem-solving skills. In such an approach, students start with very simple versions of the whole task, enabling them to learn to coordinate and integrate the different skills involved. Within this approach, it is advisable to make a sequence that emphasizes particular main skills or clusters of sub skills in different phases of the training (i.e., an emphasis manipulation approach; Van Merriënboer, 1997).

Looking at the expert-novice analysis, important skills to emphasize in the instructional program are: (1) define the information problem, (2) process information, (3) judge the quality of the information, and (4) regulation.

Instruction on defining the information problem should emphasize reading the task description and analyzing the problem into main questions and sub questions, activating relevant prior knowledge, and gaining insight into what information is needed to answer the questions. Experts also (re)formulate the problem when they are working on the presentation of the information, indicating that it may be important to outline, in the instruction, the link between the information questions and the final product or task requirements. Finally, the experts spent a little more time on the whole information problem-solving task, and especially on the problem definition, than the novices. It may thus be desirable to point out to students that solving information problems and, especially, clearly defining the problem is a time-consuming task and that they should take all the time they need.

With regard to processing information, elaboration and in-depth processing are required in order to be able to produce a high-quality end product. Working together while solving an information problem (i.e., collaborative problem solving) can stimulate elaboration and enhance students' understanding of the topic. Successful elaboration during the scanning of information also requires students to keep the information questions in mind.

Another important aspect pertains to the judgment of the quality and relevance of the information and the reliability of the sources in which the information is found. What are the criteria for judging the quality of the information? These criteria may be partly generic and partly domain specific. Teachers play an important role in helping students with the specification of criteria.
Websites supporting students' information problem-solving process often refer students to their teacher when they have difficulties with judging the quality of the sources and the information. But are teachers aware of the criteria they use and could teach? Research by Moore (1997) reveals that teachers are poorly equipped to access the world of information. It therefore seems important to gain more insight into generic and domain-specific criteria for judging information. Research by Duijkers, Gulikers-Dinjens, and Boshuizen (2001) reveals that students do use criteria to judge sources and information, but that they are not always aware of using them. The criteria reported by the students in that study were mainly related to the content and to the defined problem. Other reported criteria include the comprehensibility and completeness of the information.

A fourth aspect that deserves special attention during the training of information problem-solving skills is regulation. Students must constantly monitor and steer their information problem-solving process. Questions like 'Is this the information I need?', 'Am I still working towards an answer to my information question?', 'Is it necessary to use other search terms?', 'What were the task demands?' and 'How much time do I have left?' must be asked and answered frequently. Cognitive apprenticeship (Collins, Brown, & Newman, 1989) may offer an effective approach to foster students' regulation activities. In this approach students learn how to regulate their problem-solving process through a combination of observation, guidance and practice, or, from the teacher's point of view, through modeling, coaching and fading. Cognitive apprenticeship aims to bring internal (expert) cognitive processes out into the open. The externalization of these processes can be accomplished through modeling or demonstration, discussion, alternation of teacher and learner roles (reciprocal learning), and co-operative learning. Through dialogue the processes become explicit, so that students can gradually internalize them. It is clear that this is not a matter of blind imitation or direct strategy instruction. Instead, attention is given to metacognitive knowledge and to students' awareness of the cognitive processes: for instance, what strategies are available, how do they function, when should they be applied, and (why) are they effective? Such metacognitive knowledge and awareness is critical for students to be able to control and regulate their information problem-solving processes (see also Lazonder, 2003).

A final important topic concerns the transferability of the information problem-solving skill. In order to stimulate transfer to new situations, Van Merriënboer (1997) argues that complex cognitive skills should be trained in as many domains as possible. In addition to such variation in training situations, it should be made explicit to students that a skill that works in one domain may or may not work in another domain. It should thus be determined which features of the complex skill are generic and which aspects are more domain specific. This relates to the mindful abstraction and de-contextualization of the general principles underlying the performance of a skill, so that it becomes available in a new situation. In the literature this mechanism is referred to as 'the high road of transfer' (Perkins & Salomon, 1989).
Future research, using experimental and control settings for the teaching of information problem-solving skills, must provide more insight into the effectiveness of these guidelines for different kinds of students.

References

Bawden, D. (2001). Information and digital literacies: A review of concepts. Journal of Documentation, 57, 218–259.
Boekhorst, A. K. (2000). Informatievaardig worden in het onderwijs, een informatiewetenschappelijk perspectief: Een vergelijkende gevallenstudie in Nederland en Zuid-Afrika [Becoming information literate in education, an information science perspective: A comparative case study in The Netherlands and South Africa]. Unpublished dissertation. Retrieved January 23, 2002, from www.hum.uva.nl/~albert/public/prom-akb-tot.PDF.
Carroll, J. B. (1999). Expert Internet information access. Journal of Educational Computing Research, 20, 209–222.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Lawrence Erlbaum.
De Jong, F. P. C. M. (1992). Zelfstandig leren: Regulatie van het leerproces en leren reguleren: een procesbenadering [Independent learning: Regulation of the learning process and learning to regulate: A process view]. Unpublished doctoral dissertation, Tilburg University, The Netherlands.
De Jong, F. P. C. M., & Simons, P. R. J. (1988). Self regulation in text processing. European Journal of Psychology of Education, 3, 177–190.
Dochy, F. J. R. C. (1993). De invloed van voorkennis op leerresultaten en het leerproces [The effect of prior knowledge on learning outcomes and process]. In W. Tomic & P. Span (Eds.), Onderwijspsychologie: Beïnvloeding, verloop en resultaten van leerprocessen (pp. 97–119). Utrecht, The Netherlands: Lemma.
Duijkers, H. M., Gulikers-Dinjens, M. T. H., & Boshuizen, H. P. A. (2001). Begeleiden van leerlingen bij het zoeken, selecteren en beoordelen van informatie [Guiding students' processes of searching, selecting and judging information]. In Handboek Studiehuis Tweede Fase. Alphen a/d Rijn, The Netherlands: Samsom.
Eisenberg, M. B. (2003). A Big6™ skills overview. Retrieved July 7, 2003, from www.big6.com/showarticle.php?id=16.
Eisenberg, M. B., & Berkowitz, R. E. (1990). Information problem-solving: The big six skills approach to library and information skills instruction. Norwood, NJ: Ablex.
Eisenberg, M. B., & Berkowitz, R. E. (1992). Information problem-solving: The big six skills approach. School Library Media Activities Monthly, 8(5), 27–29, 37, 42.
Eisenberg, M. B., & Berkowitz, R. E. (2000). The Big6 collection: The best of the Big6 newsletter. Worthington, OH: Linworth.
Eisenberg, M. B., & Johnson, D. (2002). Learning and teaching information technology. Syracuse, NY: ERIC Clearinghouse on Information & Technology (No. ED465377). Retrieved July 7, 2003, from www.ericit.org/digests/EDO-IR-2002-04.pdf.
Ellis, D., Cox, D., & Hall, K. (1993). A comparison of the information seeking patterns of researchers in the physical and social sciences. Journal of Documentation, 49, 356–369.
Hill, J. R. (1999). A conceptual framework for understanding information seeking in open-ended information services. Educational Technology, Research and Development, 47(1), 5–27.
Hill, J. R., & Hannafin, M. J. (1997). Cognitive strategies and learning from the World Wide Web. Educational Technology, Research and Development, 45(4), 37–64.
Hill, J. R., & Hannafin, M. J. (2001). Teaching and learning in digital environments: The resurgence of resource-based learning. Educational Technology, Research and Development, 49(3), 37–52.
Hölscher, C., & Strube, G. (2000). Web search behavior of internet experts and newbies. Computer Networks, 33, 337–346.
Ingwersen, P. (1996). Cognitive perspectives of information retrieval interaction: Elements of a cognitive IR theory. Journal of Documentation, 52, 3–50.
Irving, A. (1985). Study and information skills across the curriculum. London: Heinemann Educational Books.
Kirschner, P. A., Valcke, M. M. A., & Van Vilsteren, P. (1997). Business game learning environment: Design and development of a competency-based distance educational business curriculum at the Open Universiteit. Distance Education, 18(1), 153–177.
Kuhltau, C. (1993). Seeking meaning: A process approach to library and information services. Greenwich, CT: Ablex.
Land, S. M., & Greene, B. A. (2000). Project-based learning with the World Wide Web: A qualitative study of resource integration. Educational Technology, Research and Development, 48(1), 45–68.
Lazonder, A. W. (2000). Exploring novice users' training needs in searching information on the WWW. Journal of Computer Assisted Learning, 16, 326–335.
Lazonder, A. W. (2003). Principles for designing web searching instruction. Education and Information Technologies, 8, 179–193.
Macdonald, J., Heap, N., & Mason, N. (2001). Have I learnt it? Evaluating skills for resource-based study using electronic resources. British Journal of Educational Technology, 32, 419–433.
MacKenzie, J. (1994). Grazing the Net: Raising a generation of free range students – part one. From Now On: The Educational Technology Journal. Retrieved July 11, 2003, from www.fno.org/grazing1.html.
Marchionini, G. (1995). Information-seeking in electronic environments. New York: Cambridge University Press.
Marchionini, G. (1999). Educating responsible citizens in the information society. Educational Technology, 39(2), 17–26.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
Moore, P. (1995). Information problem solving: A wider view of library skills. Contemporary Educational Psychology, 20, 1–31.
Moore, P. (1997). Teaching information problem solving in primary schools: An information literacy survey. Paper presented at the 63rd IFLA General Conference, Copenhagen, Denmark.
Perkins, D. N., & Salomon, G. (1989). Are cognitive skills context-bound? Educational Researcher, 18, 16–25.
Reigeluth, C. M. (1999). What is instructional-design theory and how is it changing? In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 5–29). Mahwah, NJ: Lawrence Erlbaum Associates.
Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(5), 31–38.
Schmeck, R. R., & Geisler-Brenstein, E. (1989). Individual differences that affect the way students approach learning. Learning and Individual Differences, 1, 85–124.
Shapiro, J. J., & Hughes, S. K. (1996, March/April). Information literacy as a liberal art: Enlightenment proposals for a new curriculum. Educom Review, 31(2). Retrieved December 16, 2000, from www.educause.edu/pub/er/review/reviewarticles/31231.html.
Spink, A. (1997). Study of interactive feedback during mediated information retrieval. Journal of the American Society for Information Science, 48, 382–394.
Spitzer, K. L. (2000). What every educator should know about information literacy. In M. B. Eisenberg & R. E. Berkowitz (Eds.), The Big6 collection: The best of the Big6 newsletter (pp. 3–13). Worthington, OH: Linworth.
Stripling, B., & Pitts, J. (1988). Brainstorms and blueprints: Teaching library research as a thinking process. Littleton, CO: Libraries Unlimited.
Sutcliffe, A., & Ennis, M. (1998). Towards a cognitive theory of information retrieval. Interacting with Computers, 10, 321–351.
Van Merriënboer, J. J. G. (1997). Training complex cognitive skills. Englewood Cliffs, NJ: Educational Technology Publications.
Vermunt, J. D. (1995). Process-oriented instruction in learning and thinking strategies. European Journal of Psychology of Education, 10, 325–349.
Vermunt, J. D. (1998). The regulation of constructive learning processes. British Journal of Educational Psychology, 68, 149–171.
Wilson, T. D. (1999). Models in information behaviour research. Journal of Documentation, 55(3), 249–270.
Wolf, S. E., Brush, T., & Saye, J. (2003). Using an information problem-solving model as a metacognitive scaffold for multimedia-supported information-based problems. Journal of Research on Technology in Education, 35, 321–341.
Zins, C. (2000). Success, a structured search strategy: Rationale, principles, and implications. Journal of the American Society for Information Science, 51, 1232–1247.