1. Libby Gerard
  2. http://wise.berkeley.edu
  3. Lecturer, Research Scientist
  4. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  5. https://wise-research.berkeley.edu/projects/strides/
  6. University of California Berkeley
  1. Sarah Bichler
  2. Post-Doctoral Researcher
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
  1. Phillip Boda
  2. Post-Doctoral Researcher
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
  1. Allison Bradford
  2. Graduate Student Researcher
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
  1. Emily Harrison
  2. https://wise-research.berkeley.edu/eharrison/
  3. Graduate Student
  4. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  5. https://wise-research.berkeley.edu/projects/strides/
  6. University of California Berkeley
  1. Jennifer King Chen
  2. Postdoctoral Research Scholar
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
  1. Jonathan Lim-Breitbart
  2. Designer and Developer
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
  1. Marcia Linn
  2. http://wise-research.berkeley.edu/mclinn
  3. Professor
  4. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  5. https://wise-research.berkeley.edu/projects/strides/
  6. University of California Berkeley
  1. Korah Wiley
  2. Graduate Student
  3. Collaborative Research: Supporting Teachers in Responsive Instruction for Developing Expertise in Science (STRIDES)
  4. https://wise-research.berkeley.edu/projects/strides/
  5. University of California Berkeley
Public Discussion
  • Sharon Lynch

    Professor
    May 5, 2020 | 11:32 a.m.

    It is encouraging to see that WISE has a goal to adapt so quickly to the COVID-19 circumstances. Do you have ways of tracking any increases in use of WISE over the last months? Plans for the future? How do you measure quality of use for teachers and/or students?

     

    Thanks for providing this video and best wishes with this project as it goes forward.

     
  • Phillip Boda

    Co-Presenter
    May 5, 2020 | 12:51 p.m.

    Hi Sharon! Thank you for your kind words of support! During the COVID pandemic we have seen an uptick in teachers we have worked with over the years wanting to re-engage with WISE and to revise or further customize the units through the Teacher Authoring Tool available to all WISE users. Teachers new to WISE have also contacted us for support implementing it during this time - many come to the 'office hours' we have set up for help and FAQs.

    Use by teachers, as with any curriculum, varies both at the teacher level (such as how they 'open' or 'prompt' a lesson for a particular day) and the school level (such as expectations and the adaptations needed to meet all students' learning needs). We have measured these factors in the past through classroom observations and differences in unit design features as part of our collaborative design research projects. During distance learning, however, the more impactful differences in teacher use are observed directly in how they use the Teacher Authoring Tool to customize the units.

    We also support customizations in STRIDES based on benchmark assessment items that are scored automatically by our natural language processing algorithms. These produce a report that lets teachers see the distribution of student scores on an item, with suggestions on how to support students in developing more sophisticated Knowledge Integration of their learning - as seen in our video. We also track log data in coordination with open-response explanations to measure students' use of the units and plausible connections to their learning. Hope this describes a bit more about WISE and the project! All the best!

     
  • Susan Kowalski

    Researcher
    May 5, 2020 | 12:32 p.m.

    WISE looks like a great resource for teachers who are trying to engage students in online science instruction during COVID-19. I'm curious about how WISE supports students in developing explanations. How do you support students in becoming ever more adept at this critical science and engineering practice?

  • Phillip Boda

    Co-Presenter
    May 5, 2020 | 01:02 p.m.

    Hi Susan! Thank you for your comment and questions! WISE supports students in developing explanations through multiple tools available to any teacher customizing the units in the Teacher Authoring Tool. These include concept mapping tools with automated feedback, annotation activities that support distinguishing between previous and newly learned ideas in science, and tasks tied to broader pedagogical structures such as an Idea Market, where students post their science ideas or engineering design proposals and respond to their peers with feedback. We have found it imperative that students be given multiple opportunities to revise their previous science ideas through tasks that support distinguishing between prior knowledge and new content learning. This complex sense-making process is also supported in STRIDES by providing teachers with distributions of students' scores, automatically scored by our NLP algorithms on the Knowledge Integration scale for specific benchmark assessment items in the unit, as well as on DCI, CCC, and SEP scales for some items. We also suggest ways teachers might address students' alternative ideas or struggles to integrate their science understandings by offering activities tied specifically to their students' scores, giving teachers options to use moving forward or ideas for designing their own tasks. Hope this clarifies some pieces! All the best!

  • Sarah Bichler

    Co-Presenter
    May 5, 2020 | 06:07 p.m.

    Hi Susan, 

    Not too much to add to Phillip's detailed answer - one way we tried to support students with respect to SEP in the "Genetics of Extinction" (GenEx) unit is by adding support activities at the end of the unit, which the teacher can suggest to students upon noticing where they need the most support. For example, students who did not use evidence from a graph in the milestone item in the GenEx unit might receive a teacher comment that says: "Well done thinking about how lizards with shorter back legs went extinct after a snake predator was introduced into their environment! To gain more ideas, go to Step X in the unit and help Mohinder revise his explanation." The support activity with Mohinder then focuses on using evidence from a graph to explain that evolution takes place over multiple generations; students are guided by questions to gain information from the graph and then encouraged to go back to their own explanation and revise it.

    All the best,

    Sarah

  • May 5, 2020 | 04:31 p.m.

    This is a very insightful video, and of course WISE continues to evolve, improve and impress.

    The partnership with ETC for natural language processing is impressive when considering automation in online learning going to scale, extending the reach of the "instructor". Kudos!

    I'm curious how, if at all (or if even appropriate), it might leverage the prior NSF work of Jim Minstrell (who looked at student responses, and facets or "snippets" in his online system, for particular understandings of science concepts) to help diagnose student understanding and preconceptions, providing targeted feedback?

    Easily said, I know - I'm curious what the algorithm uses to generate automated feedback (word count, key phrases, etc.). I'd love to hear more if it's not too complex!

  • Sarah Bichler

    Co-Presenter
    May 5, 2020 | 05:52 p.m.

    Dear Albert, 

    Thanks for the kind words and the inquiry into the details of our automated scoring. We code students' explanations with respect to the three-dimensional understanding called for by NGSS, in addition to Knowledge Integration. Depending on what the specific milestone item in a particular unit lends itself to, we have automated scoring of DCI, CCC, or SEP and an overall Knowledge Integration score. In Genetics of Extinction, for example, we score understanding of natural selection (DCI), using evidence from a graph (SEP), and Knowledge Integration. The teacher report gives an overview of how students are doing with respect to DCI and SEP, as well as how well they are integrating their ideas. With this detailed report, teachers are supported in diagnosing student understanding, noticing how students express certain ideas, and ultimately responding to these ideas in a targeted fashion.
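    As a rough sketch only (the data shapes and the 1-5 scale here are hypothetical assumptions, not the project's actual report code), aggregating automated per-item scores into per-dimension overviews like the ones described above might look like:

```python
from collections import Counter

# Hypothetical records: each student response carries automated scores
# on the dimensions scored for this item (e.g. DCI, SEP, KI on 1-5).
responses = [
    {"student": "s1", "DCI": 4, "SEP": 2, "KI": 3},
    {"student": "s2", "DCI": 3, "SEP": 2, "KI": 2},
    {"student": "s3", "DCI": 5, "SEP": 4, "KI": 4},
]

def score_histograms(responses, dims=("DCI", "SEP", "KI")):
    """Count how many students received each score, per dimension."""
    return {dim: Counter(r[dim] for r in responses) for dim in dims}

for dim, counts in score_histograms(responses).items():
    print(dim, dict(sorted(counts.items())))
```

    A teacher-facing report would then render these per-dimension counts as the histograms mentioned in the video.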

    Particularly during distance learning, we designed the teacher report to encourage investigation of student ideas and building on them by adding "Support Activities" in WISE units that target specific ideas (for example, using evidence from a graph). Teachers can use the analytics as well as student responses to determine which idea a student might need help with and then provide targeted support by suggesting the respective activity.

    Another example of targeted feedback from earlier this year (during regular classroom instruction) that was facilitated by the teacher report: The overview of the aggregated student scores and investigation of individual students' responses in the "Musical Instruments and the Physics of Sound Waves" unit revealed that students confuse ideas about the pitch and the amplitude of a sound wave, especially how these two concepts are depicted in a drawing. The teacher then added classroom activities that helped kids distinguish between pitch and amplitude.

    I hope this addressed your questions and others will add to address the parts I did not!

    Sarah

  • May 5, 2020 | 10:02 p.m.

    Thank you, Sarah, for sharing more. I absolutely agree about the importance of facilitating deeper and more transferable learning by intertwining the three dimensions of NGSS. While it is older research, Minstrell was quoted quite frequently in the "How People Learn" books, and it's great to see you leveraging the research on known areas where students can easily get confused!

    I read once - I think it was Bill Penuel, but I could be mistaken - that a common challenge in many LMS dashboards is that they provide quantitative scores and progress boxes on student progress, but fall short on suggesting alternative activities and resources for teachers to use in light of where a student may be struggling. It somewhat reminds me of Krajcik's notion of "educative instructional materials" that build in support for teachers during implementation with respect to, say, pedagogical content knowledge. Thank you very much - a wonderful example!

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 01:37 a.m.

    Hi Al,

    Great to hear from you. We are inspired by the work of Jim Minstrell. Our Knowledge Integration scoring rubrics analyze the specific ideas students generate - which could be seen as facets. We encourage students to distinguish between the ideas they discover in the unit and the ideas they already have, using evidence from the unit. The rubrics reward students for making links between ideas and evidence.

    Thank you for your interest in our work.

    Cordially,

    Marcia

     
  • Margo Murphy

    Facilitator
    May 5, 2020 | 07:36 p.m.

    Very timely, as teachers are trying to find effective resources for students in remote learning environments. Do you have any student engagement data, and data to show student growth in understanding these topics? I am also interested in knowing how much teacher time is needed to give feedback. Thanks! Great work!

  • Sarah Bichler

    Co-Presenter
    May 5, 2020 | 08:50 p.m.

    Hi Margo,

    Thanks for asking these questions! We're curious about the answers ourselves and are currently working on them :).

    This does not stem from a systematic analysis yet, but a first impression is: a) there are kids who cannot log in to online curricula because they don't have access (no internet, no computers), b) not all students who log in for a teacher's run actually do work, and c) some kids actually provide very detailed answers to open response items :)! A more systematic analysis is on its way.

    Also with respect to student understanding: we will look at learning gains but have not yet done so. From interviews with two teachers, I know that some students do really great work and others do the bare minimum; it will be interesting to see to what extent engagement predicts how much kids learn.

    Thanks especially for your last question! I am adding it to the interview questions; I haven't yet asked how long it took teachers to send feedback comments. The report definitely saves teachers time, as they get an aggregated overview, and they have expressed that it takes load off of them (finding an appropriate item, looking at all the responses, identifying themes, coming up with ways to respond). The report for distance learning includes pre-made comments that teachers can send to students via copy and paste. Multiple comments are suggested, and teachers can choose which one is best suited for individual students. Using the pre-made comments is probably quicker than writing your own. Teachers have also opted to combine their own comments with the pre-made ones, and some teachers use this function more than others.

    I hope to have answered your questions!

    Take care,

    Sarah

     
  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 01:46 a.m.

    Hi Margo

    Thank you for your interest in our work. To add to Sarah's comments: one interesting finding so far is that one teacher reported that some students who are normally distracted in class are actually more engaged when working remotely. We wonder if these students are better able to concentrate at home. Another possibility is that they are discussing their work with a sibling or friend. The greatest challenge teachers report is that some students have not been able to show up at all. Although districts are providing computers and internet providers are offering free accounts, some students are not able to access these services or do not have stable places to work.

    Cordially,

    Marcia

  • May 12, 2020 | 04:14 p.m.

    We need to figure out how we are going to do professional development for our project at a public aquarium, which involves diverse organisms, food webs, and adaptations (described in our video). How do we access WISE to review whether it will be useful for inspiring and teaching teachers at this summer's virtual PD (virtual because of COVID-19)?

  • Jonathan Lim-Breitbart

    Co-Presenter
    May 12, 2020 | 06:21 p.m.

    Hi Jeffrey! You can preview all of our curriculum materials and sign up for a free account at https://wise.berkeley.edu. As mentioned, we also provide an authoring tool that teachers and curriculum developers can use to modify or adapt existing units or create their own on any topics of interest. Let us know if you have any other questions at all.

  • Lynda McGilvary

    Facilitator
    May 6, 2020 | 03:32 a.m.

    This looks like a very well-developed and much needed program, so congratulations on your work. It must be very satisfying to see it meeting the needs of teachers and students right now! Can you explain how teachers receive training or orientation in order to effectively use these tools?

  • Sarah Bichler

    Co-Presenter
    May 6, 2020 | 08:20 p.m.

    Dear Lynda,

    To add to Marcia's response, we STRIDES researchers meet with teachers before, during, and after a STRIDES unit implementation. We prepare for using the report, discuss the report, and ask for teachers' reflections and experiences after using the report. How detailed such "training" is depends on what a teacher wants. During distance learning, we shared "guides" - short documents that walk teachers through the rationale, how to access the report, and how to use the comment function. Teachers can use them if they wish.

    Best,

    Sarah

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 03:46 a.m.

    Hi Lynda,

    Thank you for your interest and question. Teachers using STRIDES for remote instruction include spontaneous users who found our open source materials on the web (www.wise.berkeley.edu), teachers who have previously participated in face-to-face workshops, and teachers who are at schools where a teacher participated in a workshop. Spread across school sites has been rapid: when one teacher uses WISE they often introduce their peers. New teachers are mentored by veteran users. In addition, research staff often coach new users, especially in schools that are participating in a research investigation.

    Take care,

    Marcia

     
  • Lisa Dierker

    Higher Ed Faculty
    May 6, 2020 | 10:35 a.m.

    This is such a timely initiative to learn about given our new normal. I love the natural language processing feature in the instructor dashboard!

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 07:18 p.m.

    Thank you for your interest in our work. Are you using natural language processing (NLP)? Being able to guide students to reconsider their responses to essay questions is a great advantage of NLP. In STRIDES we also use NLP to synthesize student ideas. This really helps teachers respond to their students' ideas.

    Check out our units at wise.berkeley.edu

    Take care,

    Marcia

  • May 6, 2020 | 11:56 a.m.

    Thank you for sharing Marcia and team,

    The outbreak of COVID-19 certainly brings resources such as WISE into greater focus.  I love your use of office hours for instructors to have their questions addressed.

    The discussion tool layout looks compelling. One thing I find challenging with tools like Google Documents is supporting interaction across students. What types of strategies do you use to guide student discussions in WISE units?

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 07:06 p.m.

    Thank you for asking. WISE has an Idea Market feature where students can comment on relevant questions and help each other resolve a dilemma. In face-to-face interactions teachers draw on student comments they can see in the teacher tools to lead class discussions. In remote teaching situations, some teachers organize small groups that can collaborate to investigate dilemmas.

    Take care,

    Marcia

  • May 6, 2020 | 05:21 p.m.

    Hi all,

    Thank you for sharing such interesting work. I really appreciated discovering the tool - much needed in the current circumstances - and reading this discussion thread.

    I had a couple of questions:

    - This looks like an interesting resource to share with our partner elementary science TOSA. What are the grade levels of students who engage with WISE and do you observe that some age groups benefit more than others from it?

    - Could you please say more about how you communicate or collaborate with partners, and overall how you make the partnership successful?

    I look forward to reading more about WISE.

    Best,

    Coralie

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 07:02 p.m.

    Thank you for asking. WISE units are primarily targeted to middle school.

    Schools partnering with WISE are invited to regular workshops where they might learn new ways to use the materials, customize to their own needs, and identify ways to make the materials better.

     

    At wise.berkeley.edu teachers can locate units for major middle school science topics. Each unit features an interactive model, guidance for students to learn the PEs associated with the topic, and activities that engage students in integrating their ideas. In addition, there are tools for teachers to monitor progress of each student, comment on student work, and assign scores. It is possible to customize the units to specific curricular needs and to interleave the activities with elements of the curriculum.

    Take care,

    Marcia

     
  • K. Renae Pullen

    Facilitator
    May 6, 2020 | 06:44 p.m.

    I really appreciate this tool especially during these uncertain times. I was wondering how you suggest teachers could use this tool with the current curricula they may be using.

  • Marcia Linn

    Co-Presenter
    May 6, 2020 | 06:58 p.m.

    Thank you for asking. At wise.berkeley.edu teachers can locate units for major middle school science topics. Each unit features an interactive model, guidance for students to learn the PEs associated with the topic, and activities that engage students in integrating their ideas. In addition, there are tools for teachers to monitor progress of each student, comment on student work, and assign scores. It is possible to customize the units to specific curricular needs and to interleave the activities with elements of the curriculum.

    Try one out and let me know what you think!

     

    Marcia

     
  • Sarah Krejci

    Higher Ed Faculty
    May 7, 2020 | 11:18 a.m.

    This looks like a great program! I love the variety of visualizations and models that students can explore.

    Is the program open access? How can schools request access to the program?

  • Marcia Linn

    Co-Presenter
    May 7, 2020 | 11:30 a.m.

    Thank you for your interest. WISE is free and open source. Check it out at WISE.Berkeley.edu

    You can preview projects and explore how they work. Or register for free and set up a trial with students.

    Let us know how you would like to collaborate.

    Marcia

  • Sarah Krejci

    Higher Ed Faculty
    May 7, 2020 | 11:37 a.m.

    Thank you for the link! I look forward to exploring and sharing with colleagues. I do hope we can develop some collaborations with your group.

  • Marcia Linn

    Co-Presenter
    May 7, 2020 | 11:51 a.m.

    Great. We also have powerful authoring tools that can be used to create new units for any age group. Your materials on aquatic life forms would be a great focus for a WISE project. I love the images!!

     

    Marcia

  • Sarah Krejci

    Higher Ed Faculty
    May 7, 2020 | 11:56 a.m.

    That's great that you have authoring tools too! I will definitely check them out. I can think of a lot of great applications for higher ed and our science education students. I just had my graduate Environmental Restoration students construct lesson plans - how cool would it be if they could develop a module for WISE on Env Res topics! Definitely some ideas for upcoming semesters.

  • Marcia Linn

    Co-Presenter
    May 7, 2020 | 12:03 p.m.

    Sounds great. Students in my design class often use WISE authoring.

     

    Marcia

  • May 7, 2020 | 04:05 p.m.

    STRIDES looks great, and it seems like a very timely tool for helping us during COVID-19. Congrats on constructing something that works when the world needs it most.

  • Marcia Linn

    Co-Presenter
    May 7, 2020 | 04:35 p.m.

    Hi Andy,

    Thank you for your interest in our work. Apropos to your work, students do seek out peers to collaborate on WISE units, definitely increasing the time they spend on science.

    Take care,

    Marcia

  • May 8, 2020 | 12:53 p.m.

    What a great tool, and so apropos to the surge in distance learning taking place! I have been a big fan of Dr. Linn's lab since grad school with Dr. Jon Vitale, and I enjoyed discussing the evolution of their work with Dr. Jim Slotta last summer at our EMIC conference here at UW-Madison. As a learning scientist, it's great to see such a comprehensive, customizable system that provides curricula accompanied by timely feedback, instructions, and scaffolds. Moreover, using NLP to score students' explanations provides good data for generating the Teacher Action Reports and their recommendations and comments to students. Is it the case that current investigations with WISE are focused on teacher integration and learning and curricular delivery?

    Sarah mentioned earlier in the thread some automated scoring of students' outputs; can y'all expound on how the various metrics coordinate over time to assess learning gains?

  • Sarah Bichler

    Co-Presenter
    May 8, 2020 | 02:17 p.m.

    Hi Michael, 

    Thanks for watching our video! Let me try to answer your question - and feel free to follow up if I am not addressing it!

    The automated scoring of student explanations results in histograms that give teachers an overview of their students' scores on DCI, CCC, SEP, and KI. For GenEx, for example, we scored DCI, SEP, and KI. To generate the report, we define "combinations" of these scores to come up with responsive instructional interventions. Your class might get a low score for SEP, a high score for DCI, and a low score for KI (on average), which results in a recommended action that helps kids focus on using evidence from the graph (to improve their SEP). When kids revise their answers, the score overviews and report are updated. Ideally, the intervention supported kids to include evidence in their explanations. Kids now have high scores on DCI and SEP; maybe KI is still low. This means they have the ideas and they use evidence, but they don't yet link their ideas. Now this combination of average scores triggers a new and different recommended action.
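    As an illustrative sketch only (the thresholds, 1-5 scale, and action texts below are hypothetical assumptions, not WISE's actual implementation), score-combination logic of this kind could look like:

```python
# Hypothetical rule-based mapping from average class scores to a
# recommended action; thresholds and texts are illustrative only.

def recommend_action(avg_scores, low=2.5):
    """avg_scores: dict mapping a dimension (e.g. 'DCI') to its class mean."""
    weak = {dim for dim, score in avg_scores.items() if score < low}
    if "DCI" not in weak and weak >= {"SEP", "KI"}:
        return "Focus on using evidence from the graph to support claims."
    if weak == {"KI"}:
        return "Students have the ideas and use evidence; support linking them."
    if not weak:
        return "Offer an extension or peer-feedback activity."
    return "Revisit the core ideas before asking for revised explanations."

print(recommend_action({"DCI": 4.1, "SEP": 2.0, "KI": 2.2}))
```

    Each revision cycle would recompute the class averages, so the same rules yield a new recommendation as the score combination changes.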

    I hope this example makes it more clear how the various metrics coordinate over time to give insight into students' learning progression. 

    We showed the TAP for the original submission and then again for the revision so teachers could compare. For distance learning, we have one report that gets updated with every new answer (or revision) that is submitted (the logistics of having two reports, with kids submitting answers on different steps for the same item, work during regular classroom instruction but not in the distance situation).

    Best,

    Sarah

  • Susan Warshaw

    Higher Ed Faculty
    May 8, 2020 | 03:24 p.m.

    This project caught my interest, as I am an online instructor. I am wondering whether anyone is working on creating a library of science/engineering simulations that can be used for designing online and blended learning environments?

  • Marcia Linn

    Co-Presenter
    May 9, 2020 | 04:42 p.m.

    Thank you for your interest in our project. We are working on adding a repository of open source models and simulations to our authoring tool. This is currently in preliminary testing.

    Let us know if you would like to collaborate on it.

    Take Care,

    Marcia

  • Audrey Shor

    Higher Ed Faculty
    May 12, 2020 | 10:54 a.m.

    What an impressive project! Science educators of all levels can really benefit from considering this model. The self-grading tool must be such a lifeline for teachers; both in assisting with guidance on the feedback provided, as well as continuing to reinforce the educator’s own understanding of the material. Adding a repository and continuing to support the development of additional materials would be excellent for maintaining and growing a supportive community. What a great resource for our current education challenges. Thank you for supporting teachers so meaningfully. 

  • Sarah Bichler

    Co-Presenter
    May 12, 2020 | 03:05 p.m.

    Thanks Audrey!

    Yes, sometimes teachers write explanations to see what score they get :). This helps them see what ideas are targeted by the item and guide students towards those ideas. The responses from the kids then give teachers insight into the many ways students express these ideas.

    All the best,

    Sarah

  • Rebecca Ellis

    Researcher
    May 12, 2020 | 02:58 p.m.

    This is a really neat project! It has some similarities with ours, particularly our Teacher Dashboard, where teachers can see student responses in real time and provide individual feedback through reports.

    I am impressed with your automatic scoring program. While I am not working on assessment for ConnectedBio, improving assessment and feedback is one of my personal research interests. Perhaps we should have further connections offline about that -- and maybe a joint collaboration in the future would bridge or embed our ConnectedBio simulations into your scoring system and larger database!

  • Jonathan Lim-Breitbart

    Co-Presenter
    May 12, 2020 | 06:04 p.m.

    Thanks so much for your comment, Rebecca! I would be especially interested in learning about your approach to creating a teacher dashboard in ConnectedBio. It would be really fun to explore a collaboration in the future too.

     
  • Sarah Bichler

    Co-Presenter
    May 12, 2020 | 03:02 p.m.

    Hi Rebecca, 

    I'm excited to learn more about your project and the ConnectedBio simulations! Feel free to reach out to any of us to continue the conversation!

    Best,

    Sarah

     