Education

Studying student behavior and chemistry skill using browser-based tools and eye-tracking hardware#

Norbert J. Pienta*

Department of Chemistry, University of Georgia, Athens, GA 30602, USA

Received: October 20, 2016
Accepted: December 31, 2016
Published on the web: March 9, 2017

Correspondence address

*e-mail: npienta@uga.edu
#This paper was presented at the SBQ - ACS symposium on Chemical Education, held in Goiânia in May 2016. Publication costs were assisted by INCT Inomat, CNPq process 573644/2008-0.

ABSTRACT

Browser-based tools were created to collect quantitative data about university students' problem-solving skills. Three of these tools are described: a word question tool that creates ideal gas law and stoichiometry questions using a set of complexity factors; a Lewis structure drawing tool that enables the user to draw a solution for an ion or molecule assigned to them; and a "spheres" tool that uses spheres to represent atoms and molecules and thereby denote the particulate nature of matter. Results from these studies show that relatively simple questions can be made very complex by the addition of many complexity factors that challenge students' cognitive skills. The drawing tools can be used for instruction or to collect data about student understanding; the outcomes suggest that students with more instruction in chemistry are more successful, but even the performance of students after four semesters is somewhat disappointing. Eye-tracking hardware enabled the study of how students use the visual interfaces of the other tools and of how they interpret molecular representations and spectral data.

Keywords: Chemical education research, Student problem solving, Eye-tracking studies, Microscopic, macroscopic and symbolic representations, Student learning in introductory courses.

INTRODUCTION

One of the goals of introductory chemistry classes at the tertiary (i.e., university) level is to develop problem-solving skills in students. The topic has been discussed in some detail over the years.1-4 In order to probe, evaluate, or measure those skills, a number of research approaches have been implemented; for example, the "think-aloud" protocol enables the subject to describe what they are thinking about and doing to a researcher who is present.5 As an alternative, a series of browser-based tools was developed and implemented to examine cognitive issues related to solving word problems6-8 and two types of representations common to general chemistry: Lewis structures and drawings that use spheres to represent atoms in depictions of the particulate nature of matter. The webware, written in Flash, generated each question algorithmically (i.e., after the student accessed and logged into the interface), provided the tools for the user to offer an answer, and tracked the user's activities while they answered the questions. Thus, any student with the URL of the website and a valid login identification could work on the tools, allowing access in different classes or even different institutions. Furthermore, the user's "work" (i.e., on-screen actions and keyboard activity in the tool) could be captured, along with outcomes such as whether they answered the questions correctly. This allows for the accumulation and aggregation of student data from thousands of attempts. As a result, these tools represent a very useful and productive set of methods for quantitative research.

A second technology, eye-tracking hardware and software, is being used in research to monitor where users focus their visual attention while using the browser-based tools already described. In summary, the eye-tracker detects the dark pupil of the eye and provides the x, y, and z coordinates of each eye together with the size of the pupil. Because chemistry involves a range of symbolic types, including text, tables, figures, illustrations, and charts, eye-tracking hardware and software provide an additional research tool to examine how a student engages with those items. Modern eye-trackers are often part of an LCD monitor connected to a computer that serves both as the source of the material to be viewed and as the device that captures the viewer data. Alternatively, the eye-tracking hardware can be contained in a pair of glasses or goggles, which also contain a video camera; this enables the user to move around and extends the technique to actual objects like laboratory instrumentation. Although the basic captured data are the locations of the user's gaze, the duration of the gaze and sequences of locations are also part of the method. Thus, the data can be used to track a user's path of locations or gaze duration, among other measures. Certain behaviors like gaze duration7-8 and physiological responses like pupil dilation9 or heart rate9,10 have been correlated with cognitive challenge or difficulty. For the studies described herein, statistical comparisons were made using the eye-tracker data to differentiate more and less successful users or to distinguish experts from novices. The eye-tracking experiments have also been applied to different representations of molecules (i.e., lines-and-letters versus ball-and-stick)11 and to the interpretation of spectral data like proton NMR.12,13

 

EXPERIMENTAL

Three different programs were written in Flash for use in Internet browsers: (1) "Word Problems", which delivered a computer-generated word problem about the ideal gas law (i.e., given a volume at a temperature, what is the new volume at a new temperature) and a stoichiometry question that asks for a quantity of product or reactant given a quantity of a different product or reactant from an equation that was provided; (2) "Lewis Structures", which provided a group of drag-and-drop tools that allows atoms, electrons, bonds (i.e., lines), and charges to be arranged on a drawing area to represent the Lewis structure of one of 24 ions or compounds; and (3) "Spheres", which provides drag-and-drop tools that allow the user to form compounds, molecules, or ions in a drawing area using spheres to represent atoms.

Word problems

In the Word Problem tool, the software uses categories of variables and randomly assigns one variant of each variable. The variables and variants for the ideal gas law and stoichiometry questions are provided in Table 1.

 

 

This example of an ideal gas law question has the first gas identity ("An ideal gas..."), the first volume-number format, liters as both the initial and final volume units, both temperatures in K, and no pressure value given (i.e., blank pressure units and value after the phrase "...maintained at a constant value..."):

"An ideal gas occupies an initial volume of 6.22 L at a temperature of 262 K. What is the final volume in units of L if the temperature is changed to 289.6 K while the pressure of the system is maintained at a constant value? Assume that no chemistry occurred and there is no change in the amount of material."

A similar example question is generated from the variables for the stoichiometry question (i.e., alumina identity = blank; the second equation option, a word equation; the general number format; and moles to moles as the quantity units):

"Synthetic aluminum oxide is formed by heating aluminum hydroxide, also forming water as a by-product. Determine how many moles of aluminum hydroxide can form from 5.18 moles of aluminum oxide."

The word problem tool interacts with a database that allows the user to log in using a unique identifier or an institutional one for an entire class. Each user attempt randomly generates the question; provides unique, random numerical values that match the assigned number format; saves all of the data; and tells the student whether they answered correctly when they submit their solution. The interface has a simple calculator so that the program can capture the student's keystrokes. Analyses were conducted using logistic regression to determine which variables and which variants produced statistically significant outcomes.6-8
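As an illustration of the algorithmic generation just described, the following sketch assembles an ideal gas law question by randomly choosing one variant of each complexity factor. The factor names, variants, and number ranges shown here are hypothetical placeholders; the actual tool was written in Flash and uses the factors listed in Table 1.

```python
import random

# Hypothetical complexity factors and variants; the real tool (written in
# Flash) draws from the categories summarized in Table 1.
FACTORS = {
    "gas_identity": ["An ideal gas", "A sample of helium gas", "A sample of neon gas"],
    "volume_units": [("L", "L"), ("L", "mL"), ("mL", "L"), ("mL", "mL")],
    "temperature_units": [("K", "K"), ("K", "°C"), ("°C", "K"), ("°C", "°C")],
    "number_format": ["general", "scientific"],
}

def fmt(value, style):
    """Format a value either generally or in the tool's '1.23E6'-style notation."""
    return f"{value:.2E}" if style == "scientific" else f"{value:.4g}"

def in_unit(kelvin, unit):
    """Convert an internal Kelvin value to the unit shown in the question."""
    return kelvin - 273.15 if unit == "°C" else kelvin

def generate_question(seed=None):
    """Randomly assign one variant of each factor and assemble the question text."""
    rng = random.Random(seed)
    gas = rng.choice(FACTORS["gas_identity"])
    u1, u2 = rng.choice(FACTORS["volume_units"])
    tu1, tu2 = rng.choice(FACTORS["temperature_units"])
    style = rng.choice(FACTORS["number_format"])

    v1 = rng.uniform(1.0, 10.0) * (1000 if u1 == "mL" else 1)  # initial volume
    t1 = rng.uniform(250.0, 320.0)                             # initial T, stored in K
    t2 = rng.uniform(250.0, 320.0)                             # final T, stored in K

    return (
        f"{gas} occupies an initial volume of {fmt(v1, style)} {u1} at a "
        f"temperature of {fmt(in_unit(t1, tu1), style)} {tu1}. What is the final "
        f"volume in units of {u2} if the temperature is changed to "
        f"{fmt(in_unit(t2, tu2), style)} {tu2} while the pressure of the system "
        f"is maintained at a constant value?"
    )

print(generate_question(seed=42))
```

Each logged attempt would pair such a generated question with the randomly chosen variants, the student's keystrokes, and the outcome, which is the structure the logistic regression analyses operate on.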

Lewis structure drawing tools

The interface for the Lewis structure drawing tool appears in Figure 1. The program assigns the user a structure that is given as a formula, including the correct charge for ions; the structure assigned in Figure 1 was the ammonium ion NH4+, and the partial structure shown is the user's response. The user drags atoms, electrons, lines (i.e., single, double or triple bonds), and charges from the palette at the lower right into the drawing area. All components can be positioned anywhere on the drawing area, moved, or deleted. The tool captures the location of all the components in the drawing area, in addition to all of the actions taken by the user (i.e., adding, moving, deleting), and saves the data to a database along with the login identity provided by the student.

 


Figure 1. The interface for the Lewis Structure Drawing tool
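To make the data capture concrete, the sketch below shows one way the per-action records described above could be represented. The field names and the dataclass layout are assumptions for illustration; the actual tool was written in Flash and its database schema is not described here.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# A hypothetical sketch of the kind of per-action record the drawing tool
# could save; the real tool was written in Flash and its database schema is
# not described in this paper.

@dataclass
class DrawingEvent:
    user_id: str        # login identity provided by the student
    structure: str      # assigned formula, e.g. "NH4+"
    action: str         # "add", "move", or "delete"
    component: str      # e.g. "atom:N", "bond:single", "electron", "charge:+"
    x: float            # position in the drawing area (pixels)
    y: float
    timestamp: datetime = field(default_factory=datetime.now)

@dataclass
class Attempt:
    events: List[DrawingEvent] = field(default_factory=list)

    def log(self, event: DrawingEvent) -> None:
        self.events.append(event)

attempt = Attempt()
attempt.log(DrawingEvent("student42", "NH4+", "add", "atom:N", 250.0, 200.0))
attempt.log(DrawingEvent("student42", "NH4+", "add", "atom:H", 250.0, 150.0))
print(len(attempt.events))  # 2
```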

 

Spheres drawing tools

The interface for the spheres drawing tool appears in Figure 2. The drawing area for the reagents appears as the "Start" box on the left, while the products should be placed in the "Finish" box on the right. In the example, an N and an O sphere were dragged together to form NO, and two O spheres were dragged together to form O2. Nothing was added to the products window. When the user submits their drawing, they are guided through a series of questions and steps: Is the equation balanced? If not, they can enter the correct coefficients. Did you construct the representation for the balanced equation? If not, they are asked to draw it and submit it. Then they are given a set of conditions in which one starting material is the limiting reagent and asked to draw the result. The tool saves the information needed to reproduce all of the student drawings and their answers.

 


Figure 2. The interface for the Spheres Drawing Tool
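A minimal sketch of the kind of balance check implied by the guided questions above is given below; it simply counts atoms on each side of a proposed equation. The dictionary-based representation of formulas is an assumption for illustration, not the internal representation of the Flash tool.

```python
from collections import Counter

def atom_counts(species, coefficients):
    """Sum the atoms on one side of an equation.

    species: list of dicts mapping element symbol -> atoms per formula unit,
             e.g. {"N": 1, "O": 1} for NO.
    coefficients: stoichiometric coefficients, in the same order as species.
    """
    total = Counter()
    for formula, coeff in zip(species, coefficients):
        for element, n in formula.items():
            total[element] += coeff * n
    return total

def is_balanced(reactants, r_coeffs, products, p_coeffs):
    """True if every element appears the same number of times on both sides."""
    return atom_counts(reactants, r_coeffs) == atom_counts(products, p_coeffs)

# The reaction used in Figures 2 and 3: NO + O2 -> NO2 balances as 2 NO + O2 -> 2 NO2
no, o2, no2 = {"N": 1, "O": 1}, {"O": 2}, {"N": 1, "O": 2}
print(is_balanced([no, o2], [1, 1], [no2], [1]))  # False: coefficients of 1 do not balance
print(is_balanced([no, o2], [2, 1], [no2], [2]))  # True: the balanced equation
```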

 

Eye-tracking experiments

The eye-tracking experiments were conducted using a Tobii model X120 device, which is a 17-inch LCD monitor with the infrared transmitter and detector built into the frame of the device. A user undergoes a 30-second calibration in which they are asked to look at a set of locations on the screen in order to assign accurate gaze locations. In the experiment, the word problem or the interactive tool is displayed on the LCD monitor. For the word problem, the user can be reading the question, using a whiteboard tool to write an equation or devise a strategy, or using a calculator to determine a quantitative value. The software allows the path or sequence of gazes to be examined and can also plot an integrated measurement of where the user was looking. For example, one might be interested in the amount of time a user looked at the numerical value in the word problem or at the units of that number. Thus, the accompanying software enables a variety of data visualization and analysis methods that are described further in the applications below.7-9
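To illustrate one such integrated measure (the time a user spends looking at a particular region, such as a numerical value or its units), the sketch below sums fixation durations that fall inside a rectangular area of interest. The record format and the coordinate boxes are assumptions for illustration; the actual analyses relied on the visualization and statistics provided with the Tobii system.

```python
# A minimal sketch of an area-of-interest (AOI) dwell-time calculation,
# assuming fixation records of the form (x, y, duration_ms). The record
# layout and the AOI boxes are hypothetical; the Tobii software supplies
# equivalent visualizations and statistics.

def dwell_time(fixations, aoi):
    """Total fixation duration (ms) falling inside a rectangular AOI.

    fixations: iterable of (x, y, duration_ms) tuples in screen pixels.
    aoi: (x_min, y_min, x_max, y_max) rectangle in the same coordinates.
    """
    x_min, y_min, x_max, y_max = aoi
    return sum(
        d for x, y, d in fixations
        if x_min <= x <= x_max and y_min <= y <= y_max
    )

# Example: time spent on the numerical value versus the unit of that value.
fixations = [(410, 305, 220), (455, 310, 180), (530, 308, 90), (900, 640, 400)]
value_aoi = (380, 290, 480, 330)  # hypothetical box around "6.22"
unit_aoi = (485, 290, 560, 330)   # hypothetical box around "L"
print(dwell_time(fixations, value_aoi))  # 400 ms
print(dwell_time(fixations, unit_aoi))   # 90 ms
```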

 

RESULTS AND DISCUSSION

Word problems

For the ideal gas law question, the five categories of variables (i.e., complexity factors) and their variants represent 432 unique questions. The tool was used to complete assignments in several courses at several universities: the first semester of a traditional, two-semester general chemistry sequence and a single-semester preparatory class for students whose skills in chemistry warrant starting at a lower level. Over 3000 user attempts provided the data that were analyzed.6 Only three of the complexity factors yielded statistically significant results (p < 0.05): the scientific notation number format, the temperature units, and the volume units. The tool and its calculator used the format "1.23E6" rather than a power of 10 with a superscript. Although this format is common on simple calculators, students may be more familiar with seeing numbers with powers of 10, particularly on more complex, programmable calculators whose screens support formats with superscripts. The only temperature format that was statistically significant was the one in which both values were given in °C. For the case of converting a temperature in K to a new temperature in K, one would expect few difficulties because those are the appropriate units for the ideal gas law. Having a conversion from °C to K or from K to °C likely provides a clue that one should convert Celsius to Kelvin. Students who were given both values in Celsius got the question wrong at a statistically significant rate: either they made an error in converting one or more values, or, more likely, they forgot to change to the other unit. The third statistically significant result, the apparent difficulty with volume units, is quite intriguing. Students did worse on all questions except those in which the initial and final volumes were both given in liters. In fact, L to mL was the most difficult, followed by mL to mL and then mL to L. Because their calculator entries were saved, it was determined that ca. 20% of those who got the question wrong provided an answer that differed from the correct one by a factor of 1000. Another 20% of users apparently used the inverse of the correct mathematical relationship (V1/T1 = V2/T2). That there were significant differences among the volume unit changes associated with the wrong answers suggests that students do not routinely use dimensional analysis, or at least do not use it correctly.
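The two diagnostic patterns noted above, answers off by a factor of 1000 and answers computed from the inverted ratio, can be checked directly against the saved numerical responses. The sketch below is a hypothetical reconstruction of such a classification, assuming the correct relationship V2 = V1(T2/T1); it is not the code used in the published analysis.

```python
# A hedged sketch of how saved numerical answers might be classified,
# assuming the correct relationship V2 = V1 * (T2 / T1) at constant pressure
# and amount of gas. The tolerance and the categories are illustrative; this
# is not the code used in the published analysis.

def classify_answer(student_value, v1, t1, t2, rel_tol=0.02):
    """Label an answer as correct, a likely unit slip, an inverted ratio, or other."""
    correct = v1 * t2 / t1

    def close(a, b):
        return abs(a - b) <= rel_tol * abs(b)

    if close(student_value, correct):
        return "correct"
    if close(student_value, correct * 1000) or close(student_value, correct / 1000):
        return "off by a factor of 1000 (likely an mL/L conversion error)"
    if close(student_value, v1 * t1 / t2):
        return "inverted ratio (multiplied by T1/T2 instead of T2/T1)"
    return "other error"

# Example with V1 = 6.22 L, T1 = 262 K, T2 = 289.6 K; the answer is requested in L.
print(classify_answer(6875.0, v1=6.22, t1=262.0, t2=289.6))  # factor-of-1000 error
print(classify_answer(5.627, v1=6.22, t1=262.0, t2=289.6))   # inverted ratio
print(classify_answer(6.88, v1=6.22, t1=262.0, t2=289.6))    # correct
```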

At the beginning of these studies, an original research question was whether the word problem tool could create a set of questions that varied in their ability to test the short-term memory or cognitive load of the users. The original report by Miller14 and subsequent work that appears in several reviews by Sweller, van Merrienboer, and Paas15-18 discuss cognitive load in terms of the number of items that can be stored in short-term memory (i.e., typically about seven). In the word problem tool, students are given two numerical values of temperature, each with its unit; one numerical value for the initial volume but potentially two different units for the initial and final states; spurious information in the ideal gas identifier; and a few other items that might draw on these cognitive skills. Johnstone19 had previously described such an effect, which was ascribed to memory load. Although these questions might be characterized as relatively simple exercises, these data show that there is sufficient complexity to significantly reduce student success.

Eye-tracking data were collected from students completing the ideal gas and stoichiometry questions.7,8 Each session in which a student completed a question was analyzed in terms of the time spent reading, planning, and calculating the solution. When students who were more successful at completing the problems were compared with less successful students, statistically significant differences were found in several of these phases. There was no difference in the time spent originally reading the problem, but there were differences in the planning phase and in the overall time spent: the less successful students took considerably more time in the planning phase and in the total time to complete their solution. When these data were collected for the stoichiometry questions, students were given an opportunity to "think aloud" while they worked; analysis of their comments suggests that the more successful students appear better organized and are more likely to have a plan for solving the problem.8

Lewis structures

Student data using the Lewis structure tool were obtained from three courses: (1) a one-semester preparatory course (for students whose secondary school background was not sufficient or whose program of study only required the single course); (2) the first semester of a two-semester sequence of general chemistry courses; and (3) the first semester of a two-semester sequence of organic chemistry. The prep chem data were collected at two different universities. The general chemistry data were collected at a single site over three different terms. The organic chemistry data were collected at a single site during two different terms. In all cases, the student attempts occurred during voluntary use of the tool to practice for an examination. Table 2 contains the 24 different structures that students were asked to draw and the percentage of correct answers from the three cohorts: prep chem (N = 699, 31.3% correct overall), general chem (N = 1016, 40.2% correct overall) and organic chem (N = 1407, 55.7% correct overall). The N values represent structures drawn, not the number of unique students. Students were generally asked to complete five structures.

 

 

For most of the entries in Table 2, the percentage correct increases in the order preparatory chemistry < general chemistry < organic chemistry, a reasonable expectation based on the semesters of instruction in which Lewis structures would be used. Cooper and coworkers20-22 have used Lewis structures in studies related to structure-property relationships and as a measure of successful instruction concerning the structure of molecules and ions. In the studies described here, students were often given a fixed number of attempts, typically five. The software also tracks which attempt each student drawing represents. There is some evidence that students get somewhat better as they proceed through their attempts. However, a much larger dataset will be collected in order to examine the percentage correct for a given structure as a function of the attempt number.

The errors made by the students are reported for nine representative structures in Table 3. These error categories come from the Lewis tool software, which was written to give students some basic feedback about why their structures were scored as incorrect.

Four such errors were detected by the software: "atoms", indicating that not all of the correct atoms were used in the drawing or that incorrect atoms were used; "eCnt", an error in the total electron count expected for the contributing atoms in the assigned structure, including the correction when the structure is an ion; "eLoc", an error in the location of bonding and nonbonding electrons; and "+/-", whether a charge for an ion was included. In the table, some of the columns represent up to three of these errors occurring simultaneously in the student-drawn structure.
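As an illustration of the four error flags, the sketch below compares a student drawing (expressed as atom counts, a total electron count, per-atom electron assignments, and a net charge) against a reference answer and returns the subset of flags that apply. This flat representation is an assumption for illustration; the actual grading was performed inside the Flash tool.

```python
from collections import Counter

def grade_lewis(submission, answer):
    """Return the error flags ("atoms", "eCnt", "eLoc", "+/-") that apply.

    submission and answer are simplified dicts with keys:
      "atoms"     : Counter of element symbols used, e.g. Counter({"N": 1, "H": 4})
      "e_total"   : total number of electrons drawn (bonding + nonbonding)
      "e_by_atom" : Counter of electrons attributed to each atom index
      "charge"    : net charge shown on the drawing
    This flat representation is a hypothetical simplification of the tool's data.
    """
    errors = []
    if submission["atoms"] != answer["atoms"]:
        errors.append("atoms")
    if submission["e_total"] != answer["e_total"]:
        errors.append("eCnt")
    if submission["e_by_atom"] != answer["e_by_atom"]:
        errors.append("eLoc")
    if submission["charge"] != answer["charge"]:
        errors.append("+/-")
    return errors

# Example: ammonium ion NH4+ drawn with the right atoms and electrons but no charge.
answer = {
    "atoms": Counter({"N": 1, "H": 4}),
    "e_total": 8,                                          # four N-H bonding pairs
    "e_by_atom": Counter({0: 8, 1: 2, 2: 2, 3: 2, 4: 2}),  # shared pairs count for both atoms
    "charge": 1,
}
student = dict(answer, charge=0)                           # everything right except the "+"
print(grade_lewis(student, answer))                        # ['+/-']
```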

The structures in Table 3 represent a few sets of data for related compounds. Thus, the first three entries include boron-based structures: BF4-, BH3, and BH4-. Borane, BH3, is the simplest structure and has the highest success rate; almost all of the errors (i.e., 92%) come from an incorrect electron count, presumably because students gave the structure an octet of electrons around boron by adding a lone pair. For the borohydride ion BH4-, in contrast, most errors were related to a charge. The tetrafluoroborate ion BF4- derives 44% of its errors from electron location errors, typically from the placement of nonbonding pairs of electrons and/or from using the wrong number of electrons by omitting some nonbonding electrons. The three halogen-substituted methane structures also show the same problem with the placement of bonding and nonbonding electrons. The grading software expects to find the electrons within a certain distance from the center of an atom (i.e., 50 pixels); bonds and electrons placed at greater distances could be mistaken as being assigned to adjacent atoms. Thus, some of the nonbonding electrons may be present but drawn at a considerable distance from the center of the appropriate atom. For the comparison between ammonia and the ammonium ion, the greatest number of errors comes from a missing charge (i.e., 83%). When a student is using the Lewis structure tool, a set of drawing guidelines and instructions is accessible via a button on the drawing tool; those instructions point out that charges should be included for ions, but many students did not read the instructions. (The program tracks whether the students clicked the buttons to access a periodic table, drawing instructions, and chemistry help.)

 

 

Spheres

Only pilot studies of the "Spheres" drawing tool have been completed at this time. In Figure 2, the user was asked to draw the structures corresponding to the reactants and products for the reaction of NO with O2 to form NO2 (the balanced equation, 2 NO + O2 → 2 NO2, appears in the caption of Figure 3).

After completing the drawing, the student is asked whether that equation is balanced, is allowed to submit a balanced equation, and can then redraw the structures based on the balanced equation. In the final step of the tutorial, the last drawing is intended to represent how the reaction would look at the molecular level if 4 moles of NO react with 3 moles of O2, a circumstance in which there is a limiting reagent. Figure 3 shows a range of student responses to this last prompt (i.e., the limiting reagent): (1) a correct representation of the reagents and products with the correct stoichiometry and the excess oxygen reagent remaining among the products; (2) a drawing with no connectivity of atoms to make molecules but with the appropriate number of atoms; and (3) the correct number and structure of molecules without the excess oxygen shown.

 


Figure 3. Student drawings for the reaction 2 NO + O2 → 2 NO2 where 4 mol of NO react with 3 mol of O2. (Top, a correct drawing; middle, a drawing with no atom connectivity; and bottom, atoms of the correct molecules but without the extra oxygen left over in the products)
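The arithmetic behind the limiting-reagent prompt in Figure 3 can be made explicit with the short sketch below, which determines which reagent limits the reaction 2 NO + O2 → 2 NO2 when 4 mol of NO react with 3 mol of O2 and how much of the excess reagent remains. It is a worked stoichiometry example, not part of the tool.

```python
def limiting_reagent(available, coefficients):
    """Identify the limiting reagent and the leftover amounts.

    available    : dict of reagent -> moles on hand
    coefficients : dict of reagent -> stoichiometric coefficient
    Returns (limiting reagent, extent of reaction, leftover moles of each reagent).
    """
    # The reagent that supports the fewest "reaction equivalents" is limiting.
    ratios = {r: available[r] / coefficients[r] for r in available}
    limiting = min(ratios, key=ratios.get)
    extent = ratios[limiting]
    leftover = {r: available[r] - extent * coefficients[r] for r in available}
    return limiting, extent, leftover

# 2 NO + O2 -> 2 NO2, starting from 4 mol NO and 3 mol O2 (the Figure 3 prompt).
limiting, extent, leftover = limiting_reagent({"NO": 4.0, "O2": 3.0}, {"NO": 2.0, "O2": 1.0})
print(limiting)      # NO is the limiting reagent
print(2.0 * extent)  # 4.0 mol NO2 formed
print(leftover)      # {'NO': 0.0, 'O2': 1.0} -> 1 mol O2 remains among the products
```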

 

Additional data are being gathered for cases where the initial and final representations are not a chemical reaction but some other process, like the dissolution of a salt. Clicker questions in the classes from which students were recruited for these research studies, together with previous studies of the particulate nature of matter, point to difficulties when ionic compounds yield different numbers of anions and cations (e.g., MgCl2) or when complex ions like sulfate, SO42-, "dissolve" in water to give the component atoms.23-27

CONCLUSIONS

The browser-based tools allow the collection of substantial quantitative data, which, in turn, allows the analysis of complex datasets. Thus, the word problem tool could be used to examine a large number of complexity factors and variants for the ideal gas and stoichiometry questions. Using a set of paper quizzes or assignments to test those variables would likely not be possible, or would be onerous and time-consuming. Another set of studies will use this tool to examine differences in the language of the questions (i.e., how the questions were stated or explained) and how those differences are related to student success.

The drawing tools enable students to practice skills for representations that are integral to instruction, such as Lewis structures, which are simple model structures showing atom connectivity and electron ownership, and representations of the particulate nature of matter. The Lewis structure data suggest that the complexity of a structure is related to student success. Thus, the group of molecules or ions with only single bonds often has the highest success rates. Students are less successful drawing structures with multiple bonds and least successful drawing structures of ions with multiple bonds. Student success at drawing these structures increases across successive courses; for most structures, students of organic chemistry have the best success rates. Strategies for further improving success have been suggested by Cooper et al.22

Eye-tracking studies have been used to provide additional information, particularly about student use of the tools.7,8 Thus, more successful students spent less time in planning and overall in completing the word problems, a result confirmed by the "think-aloud" data that were collected while they were gazing at the word problems.8 Eye-tracking studies about molecular representations are in a pilot stage; whether students behave like experts or novices in matching appropriate structures to proton NMR spectral data can be determined from related studies.12,13 Experts follow very different gaze pathways than their novice counterparts.

The described experiments were informed by research questions about the behavior of introductory chemistry students rather than by devising interventions. In other words, a goal was to categorize the difficulties and not necessarily to find the best ways to address them; that is, basic rather than applied research. However, these research results fit into the context of a much broader set of evidence and the interventions that they suggest. The word problems on gas laws and stoichiometry could be categorized as exercises: a set of algorithmic steps that all students should easily accomplish. By adding cognitive complexity in the form of unit changes, spurious facts, and format changes, the questions could be made much more difficult. Our results and those from the cognitive load literature, particularly in mathematics, suggest a strategy for instruction. Students should be introduced to a topic via a conceptual understanding, followed by a set of examples and problems that increase in complexity. Mastery of simple examples can be followed by a series of challenges that increase the difficulty, utilizing the variables and factors that come from the published studies.6-8 Thus, because volume and temperature are directly proportional (when the pressure and number of moles remain constant), the ratio of volume to temperature (i.e., V/T) is constant. Knowing three of the four values enables one to calculate the fourth, the unknown. Once students master these ideas, they can be asked to solve similar problems in which a format or unit changes. Ultimately, the students can "ramp up" their skills to any combination of complexity items.
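As a concrete instance of the reasoning above, the short sketch below solves for the unknown volume using the values from the example question in the Experimental section and then repeats the calculation with one added "ramp-up" step (a unit change).

```python
# A worked instance of the constant V/T reasoning (assuming constant pressure
# and amount of gas), using the values from the example question in the
# Experimental section.
v1, t1, t2 = 6.22, 262.0, 289.6   # L, K, K
v2 = v1 * t2 / t1                 # V1/T1 = V2/T2  ->  V2 = V1 * (T2/T1)
print(round(v2, 2))               # 6.88 L

# One "ramp-up" step: the same relationship with the initial volume in mL and
# the answer requested in L adds a single unit conversion.
v1_ml = 6220.0
v2_l = (v1_ml / 1000.0) * t2 / t1
print(round(v2_l, 2))             # 6.88 L
```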

The Lewis tool enables students to draw Lewis structures using a specific interface and a specific group of structures. The drawing of structures and its limitations have been discussed by Cooper et al.22 The software in the studies described here is able to "grade" a student-drawn submission and identify simple errors: drawing the wrong atoms, using the wrong number or placement of electrons, and omitting the charge of an ion. But there are limitations. A student is not actually drawing the structure as they would on a piece of paper. The tool adds complexity, including testing whether students have mastered how to use the tool itself; getting a question wrong could reflect difficulty with the tool rather than with their conceptual understanding of these kinds of structures or their ability to draw a correct response with a different set of tools or with just paper and pencil. The Lewis structure tool allows drawing with few constraints but does come with some caveats. The studies do enable an instructor to know more about common errors and difficulties, something that could be transferred to instruction. Again, instruction and practice should involve "ramping up" the level of difficulty to include nonbonding electrons, then multiple bonds, and then ions.

The "spheres"drawing tool also does not include many constrains but does require the user to know how to use the interface. Students have less experience with being asked to draw such representations. Furthermore, only preliminary results are available at this point, but the expectation would be that a series of examples of increasing complexity would be most beneficial.

Using spectral data to match an organic structure requires a complex set of skills: understanding chemical shifts, spin-spin coupling, and magnetic equivalence, in addition to an appreciation for lines-and-letters representations of structures. There is a clear difference between undergraduate students in the second-semester organic course and those with considerably more experience (i.e., advanced undergraduates and graduate students conducting research).13 The study provided both sets of users with a very short time to match the structure and spectral data (i.e., 1 minute), a variable that would favor the more experienced group and one that was not examined in the study. For example, do the novices just need more time? The data suggest differences in approach, so it is not likely that time alone accounts for the differences. The data do suggest that experience makes a difference. Instruction should clearly provide the opportunity to do more examples and to practice one's skills.

 

ACKNOWLEDGEMENTS

Aspects of this paper were presented at the 39th annual meeting of the Sociedade Brasileira de Química (Goiânia, Brazil) as "Studying student behavior and chemistry skill using browser-based tools and eye-tracking hardware". We gratefully acknowledge Fernando Galembeck, the organizer of the symposium, for the invitation and the participants for their useful discussions. The development of the tools was supported, in part, by the US National Science Foundation.

 

REFERENCES

1. Bodner, G. M.; Domin, D. S.; Univ. Chem. Educ. 2000, 4, 24.

2. Bodner, G. M.; McMillen, T. L.; J. Res. Sci. Teach. 1986, 23, 727.

3. Bodner, G. M.; Herron, J. D.; In Chemical education: Towards research-based practice; Gilbert, J. K., de Jong, O., Justi, R., Treagust, D. F., van Driel, J. H., eds.; Kluwer Academic Publishers: Dordrecht, 2002, pp. 235-266.

4. Gabel, D. L.; Bunce, D. M. In Handbook of research on science teaching and learning; Gabel, D. L., ed.; MacMillan: London, 1994.

5. Bowen, C. W.; J. Chem. Educ. 1994, 71, 184.

6. Tang, H.; Pienta, N. J.; J. Chem. Educ. 2012, 89, 988.

7. Schuttlefield, J. D.; Kirk, J.; Pienta, N. J.; Tang, H.; J. Chem. Educ. 2012, 89, 586.

8. Tang, H.; Kirk, J.; Pienta, N. J.; J. Chem. Educ. 2014, 91, 969.

9. Duchowski, A.; Eye tracking methodology: Theory and practice (Vol. 373), Springer Science & Business Media: Berlin, 2007.

10. Cranford, K. N.; Tiettmeyer, J. M.; Chuprinko, B. C.; Jordan, S.; Grove, N. P.; J. Chem. Educ. 2014, 91, 641.

11. DelParto, C.; Pienta, N. J.; unpublished results.

12. Tang, H.; Topczewski, J. J.; Topczewski, A. M.; Pienta, N. J.; Proceedings of the Symposium on Eye Tracking Research and Applications, 2012, Santa Barbara, California, USA, pp. 169-172.

13. Topczewski, J. J.; Topczewski, A. M.; Tang, H.; Kendhammer, L. K.; Pienta, N. J.; J. Chem. Educ. 2017, 94, 29.

14. Miller, G. A.; Psychol. Rev. 1956, 63, 81.

15. Sweller, J.; Learning and Instruction 1994, 4, 295.

16. Paas, F.; Renkl, A.; Sweller, J.; Educational Psychologist 2003, 38, 1.

17. Van Merrienboer, J. J.; Sweller, J.; Educational Psychology Review 2005, 17, 147.

18. Paas, F.; Van Gog, T.; Sweller, J.; Educational Psychology Review 2010, 22, 115.

19. Johnstone, A. H.; Chem. Educ. Res. Pract. 2006, 7, 49.

20. Cooper, M. M.; Grove, N.; Underwood, S. M.; Klymkowsky, M. W.; J. Chem. Educ. 2010, 87, 869.

21. Cooper, M. M.; Underwood, S. M.; Hilley, C. Z.; Chem. Educ. Res. Pract. 2012, 13, 195.

22. Cooper, M. M.; Underwood, S. M.; Hilley, C. Z.; Klymkowsky, M. W.; J. Chem. Educ. 2012, 89, 1351.

23. Larkin, M.; Pienta, N. J.; unpublished results.

24. Harrison, A. G.; Treagust, D. F.; In Chemical education: Towards research-based practice; Gilbert, J. K., de Jong, O., Justi, R., Treagust, D. F., van Driel, J. H., eds.; Kluwer Academic Publishers: Dordrecht, 2002.

25. Gabel, D. L.; Samuel, K. V.; Hunn, D.; J. Chem. Educ. 1987, 64, 695.

26. Novick, S.; Nussbaum, J.; Sci. Educ. 1981, 65, 187.

27. Haidar, A. H.; Abraham, M. R.; J. Res. Sci. Teach. 1991, 28, 919.
