The LearningOnline Network with CAPA
Comparison of Homework Functionality
In 1992, CAPA (Computer-Assisted Personalized Approach) was started at Michigan State University in a small introductory physics course as a way to provide randomized homework with immediate feedback. Different students would receive different versions (for example, different numbers and options) of the same problem, so that they could discuss problems with each other but not simply exchange solutions. As an example, the figure on the right shows two versions of the same homework problem, as seen in LON-CAPA today. It might be argued that randomizations beyond simply changing numerical values can result in problems of different difficulty, and that a problem like the one in the figure would be unfair in an exam setting. In homework settings, however, higher randomization (including major variation of the scenario) leads to more fruitful online discussions.
When CAPA was first introduced, students would receive a printout of the problems and would enter their solutions through a terminal. In later years, a web interface for answer input was introduced. Almost in parallel, starting in 1991, the University of Texas Homework Service (now Quest) was developed, which followed (and still follows) very much the same approach. The main difference between the systems is that CAPA generated the problem variations dynamically and on demand, while the UT Homework Service generates all randomized versions of the problems ahead of time. In either system, the problems are coded in what amounts to a mini-programming language, which makes the creation of new problems rather cumbersome.

Both the UT Homework Service and CAPA were soon adopted by other universities, but with a major difference: the UT system runs centrally as a service for other institutions, while CAPA was distributed to other institutions and run locally. The main reasons that CAPA did not adopt a service model were scalability concerns (because problems were generated on the fly, CPU power was an issue at the time) and privacy concerns: the developers believed that grade-relevant student information should not leave campus. In light of today's Family Educational Rights and Privacy Act (FERPA), this concern remains relevant, as several universities interpret the law as prohibiting the storage of grade-relevant information off-campus or the outsourcing of services that handle such data.
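The distinction between on-demand and ahead-of-time variant generation can be sketched as follows. This is a minimal illustrative example in Python, not code from CAPA, Quest, or any of the systems discussed; the function name, identifiers, and problem values are all hypothetical. The key idea is that seeding a random-number generator from the student and problem identifiers makes each student's variant deterministic, so it can equally well be computed when the student requests the problem (CAPA's approach) or precomputed for the whole roster (the UT Homework Service's approach):

```python
import hashlib
import random

def problem_variant(student_id: str, problem_id: str) -> dict:
    """Generate a deterministic per-student variant of a toy physics problem.

    The seed is derived from the student and problem identifiers, so the same
    student always sees the same numbers while different students see
    different ones -- the core idea behind CAPA-style randomization.
    """
    seed = int(hashlib.sha256(f"{student_id}:{problem_id}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    v0 = rng.choice([5.0, 7.5, 10.0, 12.5])  # speed in m/s (hypothetical values)
    t = rng.choice([2.0, 3.0, 4.0])          # time in s
    answer = v0 * t                          # distance at constant speed
    text = (f"A cart moves at a constant speed of {v0} m/s. "
            f"How far does it travel in {t} s?")
    return {"text": text, "answer": answer}

# On-demand generation (CAPA): compute the variant when the student requests it.
variant = problem_variant("student42", "kinematics-01")

# Ahead-of-time generation (UT Homework Service): precompute variants for a roster.
bank = {sid: problem_variant(sid, "kinematics-01") for sid in ("a", "b", "c")}
```

Because the variant is a pure function of the identifiers, the on-demand approach trades storage for computation, which explains the scalability concern mentioned above: every problem request costs CPU time, a real constraint on early-1990s hardware.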
CAPA's distribution principle brought some challenges that the UT Service did not encounter. Because editing new problems is a time-consuming task, and because introductory physics problems are very similar, faculty at different institutions soon started to exchange problem libraries with each other. But because they had separate installations, such exchanges meant sending the associated files via FTP or exchanging floppy disks. Overcoming this infrastructural shortcoming was one of the main design principles in the next generation of CAPA, which is now LON-CAPA.
Other technical implementation issues of CAPA, such as its (at the time) dependence on X-Windows for problem editing and course management, prompted a team at the University of Rochester to develop WeBWorK. The system follows the same educational philosophy as CAPA and the UT Homework Service, but uses the Web as its interface and the Perl programming language for homework editing. In 1997, WebAssign was started at North Carolina State University and soon became a commercial spin-off. Very early on, WebAssign worked with textbook publishers to offer end-of-chapter problems as a centralized service; students pay for access to these problem sets. The ability for instructors to create their own questions was added only later. The editor is template driven, with instructors filling in blanks to create questions of certain types.
In summary, CAPA, the UT Homework Service, WeBWorK, and WebAssign offer very similar problem functionality with comparable randomization features. The systems differ in their distribution mechanisms; in their technology choices (CAPA and the UT Homework Service were initially driven by paper-based assignments and terminal input, adding web interfaces only later, while WeBWorK and WebAssign were web applications from the start); and in their problem-editing interfaces (CAPA, the UT Homework Service, and WeBWorK offer programming languages, namely a custom language, enhanced C, and enhanced Perl, respectively, while WebAssign uses templates). CAPA, the UT Homework Service, and WeBWorK are free, while WebAssign is a commercial product.
In addition to these more traditional homework systems, online tutorial systems were developed, most notably CyberTutor, ANDES, and Interactive Examples. These systems attempt to guide physics students by asking Socratic questions, offering hints, and evaluating steps along the way; in contrast, typical homework systems for the most part evaluate only the final answer. The underlying algorithms in these tutorials are of differing complexity, with Interactive Examples following the most deterministic methods and ANDES being the most flexible. In any case, authoring new problems in these tutorial systems is very challenging, and most problems have to go through an extensive evaluation and refinement process before being ready for general deployment. None of the systems we have mentioned offers full-featured course management functionality, and none can be seen as a replacement for a course management system such as BlackBoard, WebCT, or ANGEL (now all part of BlackBoard). Because the homework functionality in these course management systems is insufficient and not well adapted for use in science and mathematics, we consider the specialized physics homework and tutorial systems complementary to, rather than competing with, mainstream course management applications.
Site maintained by Gerd Kortemeyer.
©2013 Michigan State University Board of Trustees.