
LEANNE KETTERLIN GELLER

Southern Methodist University
Public Choice
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Brian Kruse

    Director, Teacher Learning Center
    May 4, 2020 | 03:21 p.m.

    Thank you for sharing! I am interested in learning more about how the learning progressions were designed to help learners develop spatial reasoning skills. We found this to be a challenge for early elementary students, particularly when predicting where an object such as the Moon will appear on subsequent days, and also when correctly orienting their shadow drawings after some time has elapsed between observations.

     
  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 4, 2020 | 05:43 p.m.

    Thank you for watching our video and sharing your interest in spatial reasoning. Your examples of spatial reasoning with objects in children's environments (e.g., the Moon, shadows) are interesting. I would love to learn more.

    We have engaged in an iterative process of articulating the learning progression for spatial reasoning. We began with an extensive literature review and then narrowed in on two key components of the learning progression, which branch into finer-grained skills. Once we finalize these progressions, I'd be happy to share them with you.

  • Brian Kruse

    Director, Teacher Learning Center
    May 4, 2020 | 06:50 p.m.

    We weren't specifically looking at spatial reasoning in our research. We did, however, notice some challenges the students had, particularly with how the Moon appears in a slightly different place each day at the same time, then "disappears" for a time before reappearing on the other side of the sky. They struggled to complete the sequence spatially, though they were able to do so for the shape/phase of the Moon. Of course, this involves modeling, which in most cases is not something early elementary learners can do. We are curious about the specifics of what you are doing to encourage spatial reasoning in the young learners in your study.

  • Jacqueline Genovesi

    Facilitator
    Vice President
    May 4, 2020 | 03:49 p.m.

    Developing early spatial reasoning is such an important skill. I'm interested in knowing more about what types of reasoning you were trying to elicit and how you measured that reasoning.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 4, 2020 | 05:51 p.m.

    Thank you for watching our video and engaging with us.

    We have identified two key aspects of spatial reasoning for children in grades K-2: (1) students' reasoning spatially within objects by understanding that objects have properties and can be transformed in space, and (2) students' reasoning spatially between objects by understanding that objects have locations in space that are relative to other objects. 

    We have designed a number of tasks and are currently evaluating whether they truly elicit the desired reasoning skills. We are excited to share these as the research unfolds.

     
  • Maisha Moses

    Informal Educator
    May 9, 2020 | 12:06 p.m.

    This is a very interesting comment.  It makes me think about our work on developing a learning progression for the mathematical concept of function.  I can't help but think that these aspects of spatial reasoning are related to the foundational levels on the learning progression for the concept of function that we've been working to validate. 

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 4, 2020 | 05:39 p.m.

    Thank you for taking the time to watch our video! Our research team is excited to share the work we've done on this project, and we look forward to your feedback. We welcome your comments about our research efforts to date, as well as the future directions of the classroom assessment resources we are creating.

    We are also interested in hearing about your experiences with the early mathematics constructs of numeric relational reasoning and spatial reasoning. Please share any interesting activities you've seen teachers and/or parents use to elicit these reasoning skills in K-2 children. 

    You are also welcome to email me at lkgeller@smu.edu and follow me on Twitter @KetterlinGeller.

    Thank you for watching our video!

  • Josephine Lawrence

    May 5, 2020 | 02:18 p.m.

    This is a great start for students learning how to execute math strategies in different ways. 

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 6, 2020 | 05:02 p.m.

    Thank you for watching and commenting on our video. 

  • Candace Walkington

    Associate Professor
    May 5, 2020 | 02:36 p.m.

    These assessments look great! I love the use of different modalities in the video, the role of teachers in the research, and the focus on conceptual understanding.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 6, 2020 | 05:03 p.m.

    Thanks for watching our video. I appreciate your feedback, Candace! 

  • Abigail Levy

    Facilitator
    Distinguished Scholar
    May 5, 2020 | 03:22 p.m.

    I applaud the involvement of researchers, teachers, and students in your development process. I assume there were important areas of convergence across the three groups, as well as divergence. Could you explain what those areas of convergence and divergence were, and share what was surprising about this to you?

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 6, 2020 | 05:11 p.m.

    Thank you for asking such an insightful question. Yes, we found multiple areas of convergence and divergence as we applied the findings to reconcile the learning progression and better understand the item models. Because we engaged with the stakeholders in different ways, the information we gained from each group was not in direct conflict or alignment. However, as we tried to understand the details of how students' thinking develops, the sequence or progression of understanding, and the ways in which the concepts intertwine, we gathered different insights from each group. To date, we have identified the areas of divergence in particular and will look to our next study to help us better understand these findings.

    An area that surprised us was the differences we observed in the relative importance of some concepts across stakeholders. For example, researchers identified some aspects of both constructs as essential for specific grades, but when we gathered data from educators about the frequency and importance of those skills, there were discrepancies. We view these as important findings to inform both research and practice.

  • Abigail Levy

    Facilitator
    Distinguished Scholar
    May 7, 2020 | 09:55 a.m.

    This is such an interesting observation, and one that resonates with work we've done with professors across science disciplines and high school teachers. The two groups brought different perspectives, opinions, and depths of knowledge about their particular science discipline, which informed their ideas about what was important for students to know and be able to do, and why. Can NGSS provide the criteria for determining the importance of particular concepts for the purposes of your current project, or is this something that can wait for a future study? If not, how do you think you'll navigate these decisions?

  • Chris Mainhart

    K-12 Teacher
    May 5, 2020 | 04:07 p.m.

    Working as a STEAM Coach, I am always looking for robust assessment tools that can support the Pre-K to 3 staff that I work with. I need to take a deeper dive into the MMaRS assessment tools. How much time does a typical one-on-one interview take?

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 6, 2020 | 05:16 p.m.

    Thank you for your question. We appreciate the work you do to support teachers and students! We understand the challenges of finding time to administer assessments. In fact, we have a Teacher Advisory Panel that has been providing meaningful input on the role their current assessments play in their decision making, their needs from a new assessment (like ours), and the feasibility and usability of our items.

    Currently, our interviews last about 30 minutes and cover different sections of the learning progression for each construct. By the end of the project, our goal is to have classroom assessment resources that can be administered in 10-15 minute interviews. 

    Thanks again and please keep in touch.

  • Sarah Powell

    Researcher
    May 6, 2020 | 09:14 a.m.

    Hi Leanne! It's really great to see the early mathematics measures coming together. I'm very interested in seeing your cognitive interviews and learning about the results. Do you have any preliminary information to share? I could see using such an interview in some of my own work.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 6, 2020 | 05:18 p.m.

    Thanks for watching our video, Sarah! We have administered the cognitive interviews with children in grades K-2 for both constructs. The interviews went through a rigorous development process and will serve as the basis for our item models. These items will then be put into play as the operational assessment. I'm happy to share more information with you.

  • Sarah Powell

    Researcher
    May 6, 2020 | 06:21 p.m.

    Super. Let's chat about that soon!

  • Alison Billman

    Facilitator
    Director of Early Elementary Curriculum
    May 6, 2020 | 06:00 p.m.

    I appreciate the work you are doing to help teachers meet the needs of their students. Assessment--knowing what kids know--in the primary grades is such a challenge when we know that it is during the one-to-one conversations that teachers are able to gain deeper insights into a child's facility with different concepts. If I understand correctly, the assessments you are designing are aligned to a learning progression. Does that mean that the assessment itself measures incremental steps in the learning progression? If so, would you complete the entire assessment with each child or stop at a certain point based on observations of some level of understanding? 

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 9, 2020 | 09:50 a.m.

    Thanks for watching our video and meaningfully engaging in our research. Yes, we are designing the assessments aligned to learning progressions (one for numeric relational reasoning and one for spatial reasoning). As you said, because the assessments measure students' incremental development along the progression, our intention is to stop the assessment at a certain point. In our previous work with older children (grades 2-8), we designed similar assessments and implemented a stopping rule after a specific threshold was reached. We are hoping similar methods can be employed for younger children. 

    Thank you again for your question.  

     
  • John Stiles

    May 6, 2020 | 11:27 p.m.

    When designing reasoning assessments that include natural phenomena, it is important to be guided by the NGSS performance indicators for particular grades. Otherwise, the questions may not be well suited to students' level of reasoning ability.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 9, 2020 | 09:51 a.m.

    Thank you for your suggestion. We will look into this resource. 

  • Stephanie Otaiba

    Researcher
    May 9, 2020 | 08:11 a.m.

    This is an important and timely project. I appreciated how your video showed the constructs and children enacting the constructs. It was also a good idea to involve your research partners. Your figures showing the iterative development made the process clear.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 9, 2020 | 09:51 a.m.

    Thank you for watching our video, Stephanie! I appreciate your comments. 

  • Edith Graf

    Researcher
    May 9, 2020 | 01:50 p.m.

    Thank you for sharing your work on this project! I appreciate the use of cognitive interviews as part of an iterative design process, as well as the inclusion of a spatial reasoning component.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 12, 2020 | 07:06 p.m.

    Thanks for watching our video, Aurora. I appreciate your feedback.

  • Lindy Crawford

    Higher Ed Faculty
    May 11, 2020 | 10:52 a.m.

    I respect the research being conducted by RME and appreciate your team's understanding of the need to involve all stakeholders in the development and validation of mathematics measures.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 12, 2020 | 07:08 p.m.

    Thanks for stopping by and commenting on our work, Lindy. 

  • Dave Miller

    Higher Ed Faculty
    May 12, 2020 | 09:29 a.m.

    Thanks for sharing this valuable project. I'm intrigued by the development of the cognitive interview protocol. I'm wondering about the study size and your plans for next steps. Thanks!

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 12, 2020 | 07:11 p.m.

    Thank you for watching our video and engaging with us in discussion. We conducted a total of 48 cognitive interviews with students in grades K-2 for spatial reasoning and 64 for numeric relational reasoning. We are using the cognitive interview data to empirically evaluate the learning progression by gathering evidence to better understand students' reasoning and thinking around these constructs. We are also using the information to better understand how the tasks we designed elicit students' reasoning. Thank you again for watching our video.

  • Deni Basaraba

    K-12 Administrator
    May 12, 2020 | 12:04 p.m.

    This is an interesting and important project, particularly given the need to bolster students' conceptual understanding of abstract concepts like spatial reasoning. The use of learning progressions to guide assessment development is intriguing and has the potential to provide educators with a wealth of information. Similar to other assessments developed using learning progressions, are the distractors purposefully designed to provide information about students' misconceptions or errors in thinking about concepts and skills related to numeric relational and spatial reasoning? I agree with others that it will be exciting to hear more about how data from the cognitive interviews were collected (particularly with this young sample of students) and how the results informed item development and revisions.

  • Leanne Ketterlin Geller

    Lead Presenter
    Professor
    May 12, 2020 | 07:14 p.m.

    Thank you for your question, Deni. Given the population we are working with (students in grades K-2), we are designing this assessment to be an individually administered interview as opposed to multiple choice. We may have some multi-select items that are followed up by probing questions to better understand the student's reasoning behind their selection. These responses will contribute to understanding students' conceptualizations. Thanks again for watching our video and engaging in a discussion with us!
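
A note for readers on the stopping rule mentioned in the May 9 reply above: the sketch below shows one common way such a rule can work for a progression-ordered, individually administered assessment, ending the interview after a run of consecutive incorrect responses. It is a minimal illustration under assumed details; the item structure, the three-miss threshold, and all names in the code are hypothetical, not the MMaRS project's actual procedure.

```python
# Minimal sketch of a stopping (ceiling) rule for a progression-ordered interview.
# Hypothetical illustration only; not the MMaRS project's actual procedure.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Item:
    label: str   # e.g., "identifies the relative position of two objects"
    level: int   # position along the learning progression (1 = easiest)


def administer(items: List[Item],
               ask: Callable[[Item], bool],
               max_consecutive_misses: int = 3) -> int:
    """Present items in progression order and stop once the child misses
    `max_consecutive_misses` items in a row. Returns the highest level
    answered correctly (0 if none)."""
    highest_correct = 0
    misses = 0
    for item in sorted(items, key=lambda i: i.level):
        if ask(item):            # ask() returns True if the response was correct
            highest_correct = item.level
            misses = 0
        else:
            misses += 1
            if misses >= max_consecutive_misses:
                break            # threshold reached; end the interview early
    return highest_correct
```

In practice, `ask` would wrap the interviewer's scoring of each response, and the returned level would locate the child along the progression so the assessment can stop well short of the full item set.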