About the C-COI

The C-COI allows researchers to study the learning processes that K-12 students engage in while doing computational thinking and programming activities. It has undergone content and construct validation over the course of three years (Israel et al., 2016).

The C-COI is used to analyze video screen capture data from Screencastify, a video capture tool for Google Chrome.


C-COI videos are analyzed for:

  • Time on Task
  • Persistence
  • Help Seeking or Giving
  • Collaborative Problem Solving
  • Student Problems
  • Social Behaviors

History

This instrument is the result of a multi-year, collaborative effort among university faculty, computer scientists, graduate students, classroom teachers, and staff across the University of Florida and the University of Illinois. It began as a proof of concept on whiteboards, moved to spreadsheets, and is now a fully developed video analysis instrument that allows us to study how students engage in computational behaviors both independently and collaboratively.

Citation

Creative Technology Research Lab Development Team. (2020). Collaborative Computing Observation Instrument (C-COI 4th Ed.). https://ccoi.education.ufl.edu/

C-COI Help Center

The Collaborative Computing Observation Instrument (C-COI) allows researchers to study the learning processes and behaviors that K-12 students engage in while doing computational thinking and programming activities.

It has undergone content and construct validation over the course of four years (Israel et al., 2016). The C-COI is used to analyze video screen capture data from Screencastify, a video capture tool for Google Chrome. These videos are analyzed for:

  • Time on task
  • Persistence
  • Help seeking/help giving
  • Collaborative problem solving
  • Problems students face while engaged in computing
  • Social behaviors of students engaged in computing

To request access to the C-COI, please use the request form located here. After we have reviewed your request, we will contact you with further questions about the nature of your research, the number of researchers who will need access, and whether you will be using the default C-COI path structure or importing your own.

If you wish to import your own structure, we will do our best to ensure that our system and accompanying data analysis tools can manage your data; however, we cannot guarantee that the system will operate the same with imported path structures.

A detailed code book has been developed for use with the current set of C-COI codes. It was developed collaboratively by a team of experienced C-COI users and iterated on over the course of several years as a result of extensive research in schools.

Part of the coding process involves immersing oneself in the video screen recordings to become familiar with the types of student behaviors that occur during programming sessions. As one becomes more familiar with the video data and the instrument itself, code selection becomes easier. Even so, we advise researchers to check their codes against other raters' over the course of multiple coding sessions in preparation for data analysis; standard qualitative rater comparison procedures are prudent.

To establish reliability, an algorithm was created that calculates percent agreement between two raters. Percent agreement was used for this purpose rather than Cohen's kappa because of the vast number of possible combinations of nodes that may occur. To assess agreement in video analysis, a rigorous, multi-level protocol was established.
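
For illustration only, the following minimal Python sketch shows one way percent agreement could be computed once two raters' codes have been aligned by video segment. The segment alignment, code labels, and function name are assumptions made for this example and are not part of the C-COI itself.

    # Hypothetical sketch: percent agreement between two raters whose codes
    # have been aligned segment by segment. Labels and names are illustrative.
    def percent_agreement(rater_a, rater_b):
        """Return the proportion of aligned segments on which both raters agree."""
        if len(rater_a) != len(rater_b) or not rater_a:
            raise ValueError("Both raters must code the same, non-empty set of segments.")
        matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
        return matches / len(rater_a)

    # Example: two raters coding six segments of one screencast.
    rater_1 = ["Time on Task", "Persistence", "Help Seeking", "Time on Task", "Social", "Persistence"]
    rater_2 = ["Time on Task", "Persistence", "Help Giving", "Time on Task", "Social", "Persistence"]
    print(percent_agreement(rater_1, rater_2))  # five of six segments agree -> about 0.83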

Using the Demo site for the C-COI, you will be able to analyze and code our demo video just as you would in the live application. Although you will not be able to upload your own video into the Demo site, you can still use the tool while viewing your own videos on another screen. You will also be able to use our visualizations to analyze the data that you code. The demo sessions that you code are stored in your browser, so you can come back and edit a session later.

The C-COI is equipped with visualization software to help you analyze the data that you code. Currently, the tool offers pie charts, timelines, and circular Sankey diagrams. In addition, if you choose to use the default C-COI schema, the system will give you a holistic snapshot of key data.

In the future, the C-COI will allow a more open spectrum of options for users to generate custom statistics based on their unique data.

Currently the C-COI uses observational data of students’ computer screens as its main input. Our vision is to expand the input data to include eye-tracking and other innovative technologies.

If your request for access to the C-COI is approved, we will reach out to your research team to discuss your onboarding. If you need to use your own path structure, you can upload a JSON file containing the structure directly to our website. If the path structure you upload passes testing on our system, you will be able to select it as your default schema.
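
As a rough illustration, the Python sketch below checks that a custom path-structure file parses as valid JSON and that every node carries a label before you attempt an upload. The file name and the nested "label"/"children" shape are assumptions made for this example; the actual schema expected by the C-COI may differ, so treat this only as a pre-flight sanity check on your own file.

    # Hypothetical pre-upload check on a custom path-structure JSON file.
    # The "label"/"children" layout is an assumed shape, not the official C-COI schema.
    import json

    def check_path_structure(path):
        with open(path, encoding="utf-8") as f:
            structure = json.load(f)  # raises an error if the file is not valid JSON

        def walk(node):
            if not isinstance(node.get("label"), str) or not node["label"]:
                raise ValueError("Every node needs a non-empty 'label'.")
            for child in node.get("children", []):
                walk(child)

        for root in structure:
            walk(root)
        print(f"{path}: {len(structure)} top-level node(s) parsed successfully.")

    check_path_structure("my_path_structure.json")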

Depending on the hardware and software you use to collect screen capture data, you may need to consider issues such as logging students into the screencast software, navigating district security or access protocols, and interactions between the screencast software and any other classroom software that may be in use (e.g., Google Classroom). It is also prudent to select a secure storage site for the data and to make sure that students cannot access it or the screencast account.

It may be difficult to ascertain who is speaking during video screencast recording sessions due to normal classroom noise and activity. Having each participant state their identifier and speak for approximately 15-20 seconds allows those analyzing the data to become familiar with their voice. We recommend repeating this process at the end of the screencast recording as well. For example, the student may be asked to repeat the following script: "My ID number is _____. I am working on _______."

Depending on the software used, students may be able to turn off the video screen recording during data collection. Therefore, it is advisable to monitor students’ computers and periodically check that recording is still active.

In addition to video screen recordings, it is also helpful to have concurrent classroom observations to note things such as whether the students leave their computers, move to collaborate with peers, etc. Some data cannot be captured with video screen recordings, so this additional data is sometimes crucial to contextualize what is happening on students’ computer screens.