Some testing notes on Google’s Power Searching course platform

In the summer of 2012 Google held an online instructional course called Google Power Searching. The course was built on an interactive platform that combined text, video, and skill-check questions, along with cumulative tests and time-released activities. Google was the most recent player to enter the Massive Open Online Course (MOOC) space, enrolling over 150,000 people in the course.

This past week Google released the software behind the Power Searching course. While interesting simply as an alternative for teachers who want to build interactive online learning environments, the application is notable in that it was designed to run on the Google App Engine platform. App Engine provides an integrated development and testing environment and is built around a cloud infrastructure that scales automatically.

A quick deployment of the instructional platform on Google App Engine revealed some interesting features. First, the course instructional elements are contained in comma-separated value (CSV) files and can be easily loaded into the course. This lets course designers write text and record multimedia resources as needed, while the platform provides a visually appealing and easy-to-use interface.
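To make the CSV-driven approach concrete, here is a minimal sketch of how lesson rows might be parsed into course elements. The column names (`unit`, `lesson`, `title`, `video_url`) are illustrative assumptions on my part, not the platform's actual schema.

```python
import csv
import io

# Hypothetical sample of a course content file. The columns are
# assumptions for illustration, not the platform's real format.
SAMPLE = """unit,lesson,title,video_url
1,1,Introduction to search,http://example.com/v1
1,2,Filtering results,http://example.com/v2
"""

def load_lessons(text):
    """Parse CSV course rows into a list of dicts keyed by column header."""
    return list(csv.DictReader(io.StringIO(text)))

lessons = load_lessons(SAMPLE)
print(lessons[0]["title"])  # -> Introduction to search
```

The appeal of this design is that a course author only ever edits a spreadsheet-like file; the application re-reads it on load, so no code changes are needed to revise content.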

One potential downside is that the interactive skill checks are implemented in JavaScript code files within the application itself. This could prove daunting for course designers seeking to manage their course elements efficiently.

The tight integration between Google App Engine and the software, however, underscores the effectiveness of an open source platform that can be deployed without the overhead of customizing and managing an IT environment. In addition, the reliance on the Django framework within Google App Engine enables automated administrative interfaces that would otherwise sit far down the development list for an application like this.

In attempting to map some of my existing course activities onto the platform, I found that I often design activities that are complex and larger in scale. Decomposing these activities into discrete, step-by-step processes proved difficult in the Google platform. For example, in a simple test with one part of a class, I found that some tasks were difficult to break down into manageable sizes.

A second challenge I ran into was the somewhat limited set of assessment tools. The platform supports multiple-choice, true/false, and auto-assessed short-answer questions, but has no functions for building tables or matrices from student exploration. In addition, only the quizzes gathered and preserved data on student activities.
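To illustrate what "auto-assessed" means in practice for the question types above, here is a rough sketch of how such grading might work. The matching rules (exact match for multiple choice, a case-insensitive pattern for short answers) are my own assumptions, not the platform's actual grading code.

```python
import re

def grade_multiple_choice(answer, correct):
    """Multiple-choice and true/false reduce to an exact comparison."""
    return answer == correct

def grade_short_answer(answer, pattern):
    """Accept any response matching a case-insensitive regular
    expression, ignoring leading/trailing whitespace. This kind of
    pattern matching is one plausible way short answers can be
    auto-assessed without human review."""
    return re.fullmatch(pattern, answer.strip(), re.IGNORECASE) is not None

print(grade_multiple_choice("b", "b"))                       # True
print(grade_short_answer("  Boolean ", r"boolean( logic)?"))  # True
```

The limitation I ran into follows from this model: anything richer than a string comparison or pattern match, such as a table a student fills in during exploration, has no natural home in this style of assessment.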

