Expanding a national network for automated analysis of constructed response assessments to reveal student thinking in STEM

Mark Urban-Lurain, Melanie M. Cooper, Kevin C. Haudek, Jennifer Julia Kaplan, Jennifer K. Knight, Paula P. Lemons, Carl T. Lira, John E. Merrill, Ross Nehm, Luanna B. Prevost, Michelle Kathleen Smith, Maryanne Sydlik

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Improving STEM education requires valid and reliable instruments that provide insight into student thinking. Constructed response (CR) assessments reveal more about student thinking and the persistence of misconceptions than multiple-choice questions do, but they require more analysis on the part of educators. In the Automated Analysis of Constructed Response (AACR) Research Group (www.msu.edu/~aacr), we have developed constructed response versions of well-established conceptual assessment inventories and created computer-automated analysis resources that predict human ratings of student writing about these topics in introductory STEM courses. The research uses a two-stage, feature-based approach to the automated analysis of constructed response assessments. First, we design items that target important disciplinary constructs identified by prior research; the items are administered via online course management systems, where students enter their responses. Second, we use lexical analysis software to extract key terms and scientific concepts from the students' writing and use these terms and concepts as variables in statistical classification techniques that predict expert ratings of student responses. The inter-rater reliability (IRR) between the automated predictions and expert human raters is as high as the IRR between human experts. We recently received another round of funding to extend this work by building an online community where instructors may obtain items, score student responses, and contribute to the library of items and resources necessary for these analyses. We provide an overview of the goals of the project and introduce opportunities to participate in the development of a national network of faculty using these techniques.
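
    To make the two-stage approach concrete, below is a minimal sketch in Python, assuming scikit-learn as the classification library. The abstract does not name the AACR group's actual lexical analysis software, classifier, or data, so the bag-of-words vectorizer, logistic regression model, and all example responses and ratings here are illustrative assumptions, not the project's implementation.

        # Minimal sketch of the two-stage, feature-based pipeline (all data hypothetical).
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import cohen_kappa_score

        # Hypothetical expert-rated responses: 1 = scientifically accurate, 0 = misconception.
        responses = [
            "ATP stores energy in its phosphate bonds.",
            "Energy is created inside the mitochondria.",
            "Cellular respiration transfers energy from glucose to ATP.",
            "Plants get most of their mass from the soil.",
            "Glucose is oxidized to release energy for the cell.",
            "Organisms evolve because they try to adapt.",
        ]
        expert_ratings = [1, 0, 1, 0, 1, 0]

        # Stage 1: lexical analysis, extracting key terms from student writing
        # as predictor variables (here, a simple bag-of-words representation).
        vectorizer = CountVectorizer(stop_words="english")
        X = vectorizer.fit_transform(responses)

        # Stage 2: statistical classification to predict expert ratings.
        model = LogisticRegression().fit(X, expert_ratings)

        # Apply the trained model to a new, unrated student response.
        new_response = ["Mitochondria transfer energy from glucose to ATP."]
        print(model.predict(vectorizer.transform(new_response)))

        # Inter-rater reliability between machine predictions and human ratings
        # on a held-out set could then be estimated with, e.g., Cohen's kappa.
        machine = [1, 0, 1, 0, 1, 1]
        human = [1, 0, 1, 0, 0, 1]
        print(cohen_kappa_score(machine, human))

    In practice, such a classifier would be trained on a much larger corpus of expert-rated responses and validated against held-out human ratings before its predictions were reported back to instructors.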

    Original language: English (US)
    Title of host publication: ASEE Annual Conference and Exposition, Conference Proceedings
    Publisher: American Society for Engineering Education
    State: Published - 2014
    Event: 121st ASEE Annual Conference and Exposition: 360 Degrees of Engineering Education - Indianapolis, IN, United States

    Other

    Other: 121st ASEE Annual Conference and Exposition: 360 Degrees of Engineering Education
    Country: United States
    City: Indianapolis, IN
    Period: 6/15/14 - 6/18/14

    ASJC Scopus subject areas

    • Engineering (all)

    Cite this

    Urban-Lurain, M., Cooper, M. M., Haudek, K. C., Kaplan, J. J., Knight, J. K., Lemons, P. P., ... Sydlik, M. (2014). Expanding a national network for automated analysis of constructed response assessments to reveal student thinking in STEM. In ASEE Annual Conference and Exposition, Conference Proceedings. American Society for Engineering Education.
