James Laffey
Professor Emeritus
University of Missouri, University of Missouri - School of Info Sci and...
http://sislt.missouri.edu/author/laffeyj/
Mission HydroSci: https://mhs.missouri.edu/

Joe Griffin
Technical Director
University of Missouri
http://Adroit.missouri.edu
Mission HydroSci: https://mhs.missouri.edu/

Justin Sigoloff
Creative Director
University of Missouri
http://adroit.missouri.edu
Mission HydroSci: https://mhs.missouri.edu/
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 5, 2020 | 09:02 a.m.

    Hi everyone, thanks for coming to our video. In 2019 we completed our second field test with 13 teachers and their students (a first field test was completed in 2018). The results show significant gains in scientific argumentation competencies. Our teachers also identified a high degree of positive social interaction in the classrooms (kids helping kids), high engagement and leadership from a number of students who were not typically strong science students, and a desire to teach with MHS in the future. We struggled with a number of technical issues, as the variety and generally low capabilities of the computers in the schools were a challenge for our vision of high-quality visual and interactive experiences for student learning. In addition, some technical aspects of the game still need improvement. Overall we are encouraged by the results and plan to continue development and work toward making MHS available broadly to teachers and students.

    We look forward to hearing your comments and addressing any questions. You can also learn more at MHS.missouri.edu.

     
  • Luke West

    Graduate Student
    May 5, 2020 | 02:48 p.m.

    Hi James and the MHS team—excellent work on the game and homepage!

    I'm especially curious about the argument structure tool users interact with after collecting evidence. Does it trigger once learners collect all possible evidence? Or if some users collect more evidence, do they then have access to more evidence when they're forming their argument?

    In our project this year, we have also been exploring student problem solving in an open-world context, and have similarly found the need for some cognitive scaffolding, particularly for multi-step problems. Your team's solution, interacting with Dr. Topo in the argument structure interface, seems like a great way to help the learner consolidate information and practice scientific argumentation, without giving too much away. 

  • Joe Griffin

    Co-Presenter
    Technical Director
    May 6, 2020 | 12:55 p.m.

    Thanks for the questions, Luke.

    Yes, our argumentation engine is triggered once players have collected all the available evidence. We initially had it more open and envisioned players going back and forth between the argumentation engine and the environment, but ultimately that was too complex.

    Through iterations based on classroom feedback, we found it was important to prime students at multiple key points to really focus them on the current task, while still relating each step to the overarching problem.
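    For readers curious what such a gate might look like, here is a minimal sketch of the idea Joe describes: the argumentation step unlocks only once every piece of available evidence for the unit has been collected. The class name, evidence IDs, and structure are hypothetical, not the MHS codebase.

    ```python
    # Hypothetical sketch of an "unlock argumentation once all evidence is collected" gate.
    # Names and data are illustrative only, not the MHS implementation.

    class EvidenceTracker:
        def __init__(self, required_evidence_ids):
            self.required = set(required_evidence_ids)   # all evidence available in the unit
            self.collected = set()

        def collect(self, evidence_id):
            if evidence_id in self.required:
                self.collected.add(evidence_id)

        def argumentation_unlocked(self):
            # The argumentation engine triggers only when every piece has been collected.
            return self.collected >= self.required


    tracker = EvidenceTracker(["sensor_reading_1", "sensor_reading_2", "field_note_1"])
    tracker.collect("sensor_reading_1")
    print(tracker.argumentation_unlocked())  # False: not everything found yet
    tracker.collect("sensor_reading_2")
    tracker.collect("field_note_1")
    print(tracker.argumentation_unlocked())  # True: gate opens, argument task begins
    ```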

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 5, 2020 | 05:17 p.m.

    Hi Luke,

    Thanks for checking out our video and for your question. We follow Osborne's approach to using learning progressions to teach scientific argumentation, so in the early units the student is only doing part of an argument, but as they progress through units they take on responsibility for more complete arguments. The evidence they collect as they go through the unit is not represented as a set of evidence they can then use in the argument engine. Rather, the argument engine somewhat abstracts or selectively uses some of the evidence to make a manageable task and something for which we can provide feedback when students make errors. The arguments themselves were crafted to be meaningful activities that emphasize the level of progress the student has made on Osborne's progressions, and so that when students made certain types of errors we could provide feedback to shape their understanding.

    Hope that is clear.... Thanks

    Also, thanks for sharing a link to your project... I'll check it out later.
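    One way to picture the progression James describes (students take responsibility for more of the argument as units advance) is a simple mapping from unit to the argument components the student must construct. This is an illustrative sketch only; the component names and unit breakdown are assumptions, not the actual MHS design.

    ```python
    # Illustrative sketch of a learning progression for argumentation:
    # early units ask for only part of an argument, later units ask for more.
    # Component names and the unit breakdown are hypothetical.

    PROGRESSION = {
        1: ["claim"],                                       # pick a supported claim
        2: ["claim", "evidence"],                           # back the claim with evidence
        3: ["claim", "evidence", "reasoning"],              # full claim-evidence-reasoning argument
        4: ["claim", "evidence", "reasoning", "rebuttal"],  # also address a counterargument
    }

    def required_components(unit):
        """Return the argument components the student is responsible for in this unit."""
        return PROGRESSION.get(unit, PROGRESSION[max(PROGRESSION)])

    for unit in range(1, 5):
        print(f"Unit {unit}: student constructs {', '.join(required_components(unit))}")
    ```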

     
  • Raffaella Borasi

    Higher Ed Faculty
    May 5, 2020 | 09:36 p.m.

    This video really gave a good sense of the experience of using your game. I have been looking for examples of uses of digital technology that truly transform STEM education - and I think your game is one!

    Since our Noyce projects are ultimately designed to prepare STEM teachers who are empowered to use technology to transform their teaching, I'm curious about not just the students' but the teachers' reactions to your product. How did it change the way they teach science?

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 6, 2020 | 08:57 a.m.

    Raffaella, thanks for checking out our video and for your comments and question. I am not sure that a 10-day study using our game changed how they will teach science, but I think it impacted them in very positive ways. The first thing that stands out in their post-interviews was the joy they expressed in seeing kids that were not typically engaged show abilities and leadership that surprised the teacher. A second was an appreciation for the role they could play while the kids were engaged in game play: being the sage on the side who could help individuals see key points or take important steps, rather than having to manage the full class. We did see great variety in the teachers' initial capabilities to understand gameplay and support the students, and also in how they managed the game-play classroom. We are looking to up our own "game" in how we can best support teachers during MHS.

  • Cheryl Bodnar

    Higher Ed Faculty
    May 6, 2020 | 08:36 a.m.

    Great work with this game. The graphics are very well done and the storyline is engaging, which should help with reaching the target audience.

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 6, 2020 | 08:59 a.m.

    Cheryl, thanks for checking out our video and for your comments about the graphics and storyline. There are lots of game types, but we think the narrative story, where the player has a role in problem solving, has great potential to engage and provide rich learning.

  • Margo Murphy

    Facilitator
    Science Instructor
    May 6, 2020 | 11:04 a.m.

    Really interesting, and I appreciate the focus on skill building within the game environment. I would love to hear more about what it looks like in the game for a student to frame and defend an argument. You mention giving students feedback as they progress through the game - how is this done? Is it an AI-type response or a person? Is this a role a teacher would play? I went to your website and saw one teacher testimonial and would love to hear more about teacher responses. Thanks --- good work happening!

     
  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 7, 2020 | 08:23 a.m.

    Hi Margo, thanks for your interest. You can see what we call our argumentation engine at 2:36 of the video; it is the mechanism for articulating an argument and getting feedback. We call it a solar system for obvious reasons, but it allows students to select among better or worse choices of evidence and reasoning to make a claim answering a driving question, which the student sees at the bottom of the screen. This mechanism allows for arguing with more than simple choices and requires actively thinking through which assembly of claims, reasoning, and evidence (CRE) is best. However, the argumentation engine is just the place for articulating the argument. We hope that playing the game makes students attentive to using evidence and reasoning in making claims, that devices like the info sheet at 2:30 model organizing data, and that one of our NPCs, Dr. Toppo, provides advance organizers before taking on the argument. There is also an arcade-like part of the game to help students practice making distinctions between claims, reasoning, and evidence, and a range of other things sprinkled throughout the game to make making arguments more important and make the student more capable.
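    To make the "assemble claim, evidence, and reasoning, then get feedback" mechanic concrete, here is a rough sketch of how such checking could work. Everything here (option IDs, feedback messages) is hypothetical and only illustrates the general pattern of crafted options plus error-specific feedback, not the MHS engine itself.

    ```python
    # Hypothetical sketch of crafted CRE (claim-reasoning-evidence) checking with
    # error-specific feedback. Option IDs and messages are illustrative only.

    CORRECT = {"claim": "C2", "evidence": "E1", "reasoning": "R3"}

    FEEDBACK = {
        "claim": "Look again at where the sensors detected pollution before choosing a claim.",
        "evidence": "That evidence doesn't bear on this claim. Which reading actually supports it?",
        "reasoning": "Your reasoning should explain why the evidence supports the claim.",
    }

    def check_argument(selection):
        """Compare the student's selections to the crafted answer and return feedback."""
        errors = [part for part, choice in CORRECT.items() if selection.get(part) != choice]
        if not errors:
            return "Strong argument: your claim is backed by relevant evidence and reasoning."
        return " ".join(FEEDBACK[part] for part in errors)

    print(check_argument({"claim": "C2", "evidence": "E4", "reasoning": "R3"}))
    ```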

    We have a dashboard that shows the teacher how the students are progressing and how many tries kids are making (sometimes just trial and error, which doesn't work very well), and that is a place where the teacher can step in, help reset the student, and sometimes work through the argument with him or her.
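    A dashboard view like the one described could be as simple as counting argument attempts per student and flagging likely trial-and-error. A minimal sketch, with the threshold and data shape assumed for illustration:

    ```python
    # Minimal sketch of a dashboard view that flags students who may be guessing.
    # The attempt threshold and data shape are assumptions for illustration.

    from collections import Counter

    TRIAL_AND_ERROR_THRESHOLD = 5   # many repeated attempts suggests guessing, not reasoning

    def dashboard_rows(attempt_log):
        """attempt_log: list of (student, unit) tuples, one per argument attempt."""
        counts = Counter(attempt_log)
        rows = []
        for (student, unit), tries in sorted(counts.items()):
            flag = "check in" if tries >= TRIAL_AND_ERROR_THRESHOLD else ""
            rows.append((student, unit, tries, flag))
        return rows

    log = [("Ana", 3)] * 6 + [("Ben", 3)] * 2
    for student, unit, tries, flag in dashboard_rows(log):
        print(f"{student:4} unit {unit}: {tries} tries {flag}")
    ```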

    We have teacher interviews and hope to report in more detail soon in an article. For now I'll just say that the teachers come to MHS with a wide range of experience and some trepidation about teaching with such an extensive game. Unfortunately, a lot of their experience with the game and their students has been spent solving technical problems (we are working to improve that). They have commented that they really like how the game play creates a very positive social experience in the class, and they really like how kids who typically are not engaged often become leaders. Almost all of the teachers expressed a desire to teach with MHS again when and if possible.

     
  • Rebecca Ellis

    Researcher
    May 6, 2020 | 04:38 p.m.

    This is a really neat idea! It reminds me of the Nancy Drew interactive games (HerInteractive) I used to play when younger, where I did puzzles and looked for clues to solve a mystery, or like Geniventure (Concord Consortium) where students work with genetics to breed dragons.

    When students interact with the NPCs, are they given choices about what to say? How do the person-to-person interactions work? I am curious about how you bridge student-driven argumentation within the framework of a designed game. Also, does the game need to be run on a computer, or will it work on touch devices? Making the shift to touch devices has been the newest obstacle for our development team on ConnectedBio, although I'm pleased to report that progress is moving along steadily!

     
  • Joe Griffin

    Co-Presenter
    Technical Director
    May 6, 2020 | 11:34 p.m.

    Hi Rebecca,

    Some of our dialogue is conversational and allows players to make choices, but some is more like mission updates, with NPCs talking to the player. The game is fully voice-acted, so characters will often be talking while the player is navigating the environment.

    It currently runs on Windows and Mac only, but we're looking into a tablet port. Maybe we could pick your brain over email about your experience?

  • Rebecca Ellis

    Researcher
    May 7, 2020 | 10:55 a.m.

    Hi Joe, 

    Sure! I'd love to chat about this more. I'm on the research side of my project, so if you have questions that I don't know the answer to, I'll loop some of the developers into the thread, too. 

    Looking forward!

  • Joe Griffin

    Co-Presenter
    Technical Director
    May 12, 2020 | 05:22 p.m.

    Hi Rebecca, 

    It would be awesome to hear about any big hurdles you had to overcome to accomplish the port. Were there any unexpected issues that popped up? And were there any hard constraints you had to accommodate?

    We've been thinking of issues like having to redo the UI and controls, and possibly reducing graphics in some places. 
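    For what it's worth, one common way to handle the graphics side of a tablet port is a per-platform quality preset that the game reads at startup. The sketch below is generic and hypothetical (setting names and values invented), not tied to MHS or any particular engine.

    ```python
    # Generic sketch of per-platform quality presets for a tablet port.
    # Setting names and values are hypothetical.

    QUALITY_PRESETS = {
        "desktop": {"texture_resolution": 2048, "shadows": True,  "draw_distance": 500},
        "tablet":  {"texture_resolution": 1024, "shadows": False, "draw_distance": 250},
    }

    def settings_for(platform):
        # Fall back to the conservative tablet preset for unknown platforms.
        return QUALITY_PRESETS.get(platform, QUALITY_PRESETS["tablet"])

    print(settings_for("tablet"))
    ```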

    My email is GriffinJG@missouri.edu if you have time to respond after this showcase. 

    Thanks again :)

  • Lynda McGilvary

    Facilitator
    Communications Director
    May 6, 2020 | 10:46 p.m.

    This looks like so much fun, both for the development team and for the students participating in your field test! How large is your development team? Are you outsourcing the programming and graphic design or does your team include that expertise?

  • Joe Griffin

    Co-Presenter
    Technical Director
    May 6, 2020 | 11:38 p.m.

    Thanks Lynda, it has been a lot of fun, and the kids seem to be enjoying it too :)

    Over the course of the project we've had over 30 people on the team, including designers, artists, developers, UX specialists, data analysts, science educators, and testers. We've contracted with some industry professionals along the way, but the entire production process has been done in house.

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 7, 2020 | 08:03 a.m.

    Just to add a bit of context to Joe's answer: many of those have been students (some extremely talented) or part-time contractors. At the height of our development work we probably had a team of six or so full-time people.

  • K. Renae Pullen

    Facilitator
    Specialist
    May 7, 2020 | 09:07 p.m.

    Well this is really cool! Can you describe the professional learning you had for teachers? What do you see happening with Mission HydroSci in the future?

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 10, 2020 | 09:01 a.m.

    Thanks for checking out our video and for your questions. For the field test, teachers participated in an asynchronous set of online course activities that took about 3 hours, had a guidebook to help them during class periods, had an opportunity to play the game (which most did not do, or did not get far into), and had a facilitator who sent out daily suggestions and responded to questions and issues. We are currently working on a scale-up grant that would enhance all of those options, including using video to show a lot of the game play activity. Our goal is to have MHS available to schools and teachers in some form of a product as soon as possible... but there is still quite a bit of work to do.

     
  • Michael I. Swart

    Researcher
    May 11, 2020 | 12:19 p.m.

    A fun virtual adventure in scientific argumentation and problem solving! It is great that there is a dashboard for the teachers, and that it allows teachers to help students and sometimes work through the argument with them.

    In one comment above, regarding the triggering of the argumentation engine - this is very important, as it would seem that getting all the facts at once vs. getting them piecewise would train different but related skill sets. Can you expound on how the engine abstracts or selectively uses some of the evidence to make manageable tasks and provide feedback when students make errors? Is this done in real time as the player collects evidence?

  • James Laffey

    Lead Presenter
    Professor Emeritus
    May 12, 2020 | 09:12 a.m.

    Michael, thanks for checking out our project. I'll try to be clear about how argumentation works in the game, but it is not one thing... there are numerous aspects of learning to make an argument which culminate in using the argumentation engine in each unit. For example, in Unit 3 Sam's garden is not growing because of pollution. Our player traces the source of pollution up the river, primarily by throwing sensors to check whether pollution is present at points on the river. Players take different routes as they work their way to the pollution, so they collect different amounts of evidence and probably come away with somewhat different understandings of the river flow and the polluted places because of their individual choices. Once they have found where the pollution is coming from, they get a refresher on doing an argument and are oriented toward the aspect of argumentation being emphasized in this unit (the game is set up following Osborne's progression approach to learning argumentation). Then the player enters the argumentation engine as a way of convincing one of the other players, who controls resources but is at first not convinced they should spend them, to come and dig up the area and extract the pollution.
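    The sensor mechanic James describes amounts to a simple search up the river: test points as you move upstream and the source lies at the last point where pollution is still detected. A toy sketch of that logic follows; the river points, readings, and names are invented for illustration and are not the MHS content.

    ```python
    # Toy sketch of tracing a pollution source upstream with point sensors.
    # The river is modeled as an ordered list of points from mouth to headwaters;
    # the readings are invented for illustration.

    river_points = ["mouth", "bend_1", "bend_2", "old_mine", "headwaters"]
    polluted = {"mouth": True, "bend_1": True, "bend_2": True, "old_mine": True, "headwaters": False}

    def throw_sensor(point):
        """Pretend sensor: returns True if pollution is detected at this point."""
        return polluted[point]

    def find_source(points):
        # Walk upstream; the source is the last polluted point before readings come back clean.
        last_polluted = None
        for point in points:
            if throw_sensor(point):
                last_polluted = point
            else:
                break
        return last_polluted

    print(find_source(river_points))  # -> "old_mine"
    ```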

    In the argumentation engine our player selects among claims for where the pollution is, based on what they discovered, and then selects evidence (an abstracted version of what students would have collected) as well as reasoning. The argument uses the evidence and context the player has come to understand, but implements a crafted version of it so we can provide appropriate feedback to the errors a student makes. Hope that helps clarify. Best

     
  • Paul Seeburger

    Higher Ed Faculty
    May 12, 2020 | 04:12 p.m.

    This project looks quite amazing!

    The graphics are phenomenal and I can see real merit in the concepts learned through the game play. I could see this being useful even for my multivariable calculus students, using the topographical map (contour plot) to determine where they are in the game environment!

    This basic game structure could be adapted to help students learn a large number of different concepts in a very enjoyable way.

    We could discuss gradient vectors, paths of steepest ascent (or descent) and so many other things from multivariable calculus.
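    As a small worked example of the connection Paul points out: given a terrain height function, the gradient at a point gives the direction of steepest ascent, which is exactly what reading a contour (topographic) map lets you estimate. The surface below is made up purely for illustration.

    ```python
    # Worked example: gradient of an invented terrain function h(x, y) = 100 - x**2 - 2*y**2.
    # The gradient points in the direction of steepest ascent; its negative is steepest descent.

    import numpy as np

    def height(x, y):
        return 100 - x**2 - 2 * y**2

    def gradient(x, y):
        # Analytic partial derivatives of h: dh/dx = -2x, dh/dy = -4y
        return np.array([-2 * x, -4 * y])

    point = (3.0, 1.0)
    g = gradient(*point)
    steepest_ascent = g / np.linalg.norm(g)   # unit vector pointing uphill
    print("gradient:", g)                     # [-6. -4.]
    print("steepest ascent direction:", steepest_ascent)
    ```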

  • Joe Griffin

    Co-Presenter
    Technical Director
    May 12, 2020 | 05:26 p.m.

    Hi Paul,

    Thank you, our artists worked really hard on the project!

    Making a 3D adventure to explore and learn multivariable calculus sounds like a fun project we would be excited to collaborate on :) 

  • Paul Seeburger

    Higher Ed Faculty
    May 12, 2020 | 05:47 p.m.

    We should talk sometime.  Have you seen my CalcPlot3D project?  It does not yet use VR, but you can use 3D glasses with it.  =)  Its primary focus is on helping students to visualize multivariable calculus concepts.  It's also been used for visually exploring concepts in differential equations, algebra, topology, chemistry, physics, biology, and various additional engineering and science classes.  See: CalcPlot3D.

    But the unique aspect of your project is the immersion of the concepts into something connected to real-life experience, and of course the gaming aspect adds a new level of motivation for many students.

    Do students earn a score in your game?  Can they win?