This article examines the impact of a technology-supported professional development strategy (FAEUC) on the beliefs, knowledge, and practices related to scientific argumentation of five university science instructors, using a pretest-posttest design without a control group. The strategy combined argument modeling, case analysis, and the design of argumentative tasks supported by digital tools. Results from the Questionnaire for Evaluating the Promotion and Use of Argumentation (CUEFOAR) and the Argument Test (AT) showed marked improvements in instructors' valuation of explicit instruction in argument structure, in their promotion of analytical skills, and in their ability to distinguish well-supported arguments and formulate more precise challenges. A joint reading of both instruments revealed coherence between what instructors report doing and how they actually analyze arguments, with nuances related to disciplinary background and teaching experience. Based on these results, the article recommends making Toulmin's argument structure explicit in instruction, strengthening work with rebuttals, and using analytic rubrics, given their value for assessing the components of argumentation consistently. Although the sample is small, the findings indicate a positive effect of the strategy and point to the need to consolidate shared criteria and to develop a discipline-specific bank of argumentative tasks.
Keywords:
scientific argumentation; professional development; evaluation; educational intervention
