Edu2ools

Definition


In an effort to overcome assessment discrepancies between applications as well as the organizational profiles and practices of evaluating institutions, UMassOnline is volunteering to create, manage, and fund a web space at which functional requirements (in the form of user stories) can be collected, distributed, and addressed by subject matter experts.

Background

Three significant barriers impede the evaluation and enthusiastic adoption of academic technologies, such as learning management systems, among institutions of higher education, including UMassOnline.

  1. The evaluation process for software on college and university campuses is evolving from a focus on features (the tools and technologies native to a system, e.g. calendars, file uploads, discussion forums) to an assessment of affordances (functionality, i.e. collaboration, communication, etc.).
  2. For obvious reasons, not the least of which is monetary incentive, vendors of academic technologies readily respond to Requests for Proposals with product descriptions in line with the interested institution's requirements. Many products do not enjoy direct support from vendors (open source tools such as Bedework, DuraSpace, Drupal, Fedora, Mahara, Moodle, OSP, Sakai, uPortal, etc.), or if they do, those vendors may not have the capacity to compete with larger competitors (e.g. Atlassian versus Microsoft, Canonical versus Apple). Without a vendor to promote a product, and thus respond to formal requests from institutions of higher education, many quality and potentially useful academic technologies go unrecognized. Even where support services for open source options exist through third-party affiliates, these providers may not be interested in a university’s evaluation stage: because such organizations specialize in support and hosting services, there is little incentive to engage with an institution until it has already identified (and thus evaluated) a specific application/technology.
  3. As explained by WCET, "As product features became more common [across systems within a product category], a focus on 'essential questions' provides more value to educators deciding among competing products." Assessment by campuses is usually based on what a system has (its tools), rather than what it enables (its functionality). As different products reach feature parity, a feature-to-feature comparison becomes less valuable.

In an effort to overcome these barriers to evaluation and informed decision-making, UMassOnline is volunteering to create, manage, and fund a web space at which functional requirements (descriptions of teaching and learning activities and objectives) can be collected and featured: think, “next-generation EduTools.” To do this, we at UMassOnline propose developing 'user stories' that describe what a system can do, not merely what it has. A user story is one or more sentences in the everyday language of faculty, students, technologists, administrators, etc. that captures what the user wants to achieve (i.e. a stakeholder, an activity, and an outcome). This represents a radically different, but immensely valuable and reliable, approach to evaluating and selecting academic technologies. Through open dialogue among current adopters (campus faculty and staff), commercial affiliates, and developers, a reference library of user stories describing activities will emerge, along with 'testing scripts' (i.e. user instructions) to assess whether the desired outcomes can be achieved (i.e. the functional affordances of the tool under assessment).
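
To make this structure concrete, the sketch below shows one way a user story and its companion testing scripts might be captured in a shared catalog. The classes and field names here (UserStory, TestingScript, and so on) are purely illustrative assumptions rather than part of any existing Edu2ools system; the point is simply that each entry pairs a stakeholder, an activity, and an outcome with system-specific instructions for verifying that outcome.

```python
# Illustrative sketch only: these classes and field names are hypothetical,
# shown to clarify the stakeholder/activity/outcome structure described above.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestingScript:
    """Step-by-step user instructions for one system under assessment."""
    system: str            # e.g. "Blackboard Learn 9.1"
    steps: List[str]       # the instructions an evaluator (or new user) follows
    expected_outcome: str  # what success looks like for this user story


@dataclass
class UserStory:
    """One teaching or learning activity expressed in everyday language."""
    stakeholder: str       # who wants it, e.g. "faculty member"
    activity: str          # what they want to do
    outcome: str           # why they want it: the desired result
    scripts: List[TestingScript] = field(default_factory=list)


# The example user story used later on this page, with a hypothetical script.
peer_assessment = UserStory(
    stakeholder="faculty member",
    activity="create small groups",
    outcome="students can do peer-to-peer assessments",
    scripts=[
        TestingScript(
            system="(any LMS under evaluation)",
            steps=[
                "Create a course site and enroll at least four students.",
                "Create two groups and assign two students to each.",
                "Configure an assignment so group members review each other's work.",
            ],
            expected_outcome="Students can see and comment on their group partners' submissions.",
        )
    ],
)
```

Because a testing script is just ordinary step-by-step instructions, the same entry can later double as training material or a help-desk knowledge base article, as noted in the opportunities list below.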

Awareness and promotion from various technology providers could be instrumental in the success of this site by simply endorsing the concept and encouraging participation. This would likely result in contributions from scores of experienced users and developers who could address the user stories with authority and within a short period of time. This, then, would establish a reference resource, much like the former EduTools website, that could be reused by other institutions facing the three barriers outlined above. In addition, because new user stories can continually be contributed as well as testing scripts, the site would remain relevant, reflecting the current teaching and learning functionality of the day, rather than legacy feature sets of deprecated systems.

This approach, i.e. identifying functional requirements rather than a feature list, provides several opportunities:

  • enables potential adopters to understand functionality (what the system can do) rather than features (what tools the system has);
  • avoids the never-ending list of sub-features (a discussion forum... that can be sorted by author, date, subject, or priority; can be searched; can be exported as a PDF, text, .doc, or .docx; can have assessments; has permissions: private, private between author and instructor, private within a group, public to the class, public outside of class, etc., ad nauseam);
  • separates tools from techniques (maybe the best approach to achieve a desired outcome is not through a discussion forum, but through a wiki or blog);
  • extensible across systems and versions ("As a faculty member, I want to create small groups so that students can do peer to peer assessments," will be a teaching and learning practice longer than any technology, e.g. a bulletin board, track changes, discussion forums, wikis, blogs, etc.);
  • addresses multiple teaching and learning styles (there may be various approaches to accomplish a learning activity or objective, that is, one outcome may be achieved using various tools);
  • contributes to training (the same instructions that help evaluators assess the functionality — creating small groups and setting them up to assess each other's work — can be used to train new users and support the help desk as a knowledge base);
  • provides end-user documentation for use (a comprehensive knowledge base for the platform, a timely opportunity considering the upcoming release of Sakai 3.0);
  • comprehensiveness (as new teaching and learning scenarios are described and contributed as user stories, they can be used for functional requirements gathering, not only to assess current LMS functionality, but to help define future development);
  • creates community (non-technical development sessions can be offered at conferences to both create user stories and develop and contribute testing scripts);
  • extends ways to contribute (provides opportunities for current adopters and commercial partners to contribute outside of code development, extending participation to campuses and the non-technical contributors on those campuses: instructional designers, faculty, even students).

The development of such a project, driven by hands-on end users and adopters with a special focus on functionality versus feature-to-feature comparisons, speaks to the essence of the various communities of collaboration within higher education: EDUCAUSE, IMS GLC, Jasig, JISC, NERCOMP, Sloan-C, UMassOnline, WCET, etc.

UMassOnline has engaged with four commercial providers of Learning Management Systems (Blackboard, Desire2Learn, eCollege, and Instructure) who have created testing scripts against our user stories. UMassOnline will be posting both our user stories and the testing scripts to the public domain so that other colleges and universities can reference our work as they undertake their own LMS assessments. In addition, we hope to extend the user stories by inviting other campuses to contribute their own unique needs, ensuring the catalog continually reflects the teaching and learning functionality required to deliver online education.

Finally, UMassOnline sees this approach as applicable to other systems beyond the LMS and will encourage development of user stories and testing scripts for other academic technologies.

Community

Potential User Story Contributors

  • Quinsigamond Community College
  • Salem State University
  • UITS
  • UMass Amherst
  • UMass Boston
  • UMass Dartmouth
  • UMass Lowell
  • UMass Medical School
  • UMassOnline

Potential Testing Script Contributors

  • Blackboard (Learn 9.1)
  • Desire2Learn
  • Instructure (Canvas)
  • Pearson (eCollege)

Potential Collaborators

  • CLAMP
  • Jasig
  • JISC
  • Sakai Foundation
  • WCET

Resources

Letter to Sakai

View Source

{section:border=}
{column:width=50%}

h3. Definition
----
In an effort to overcome assessment discrepancies between applications as well as the organizational profiles and practices of evaluating institutions, UMassOnline is volunteering to create, manage, and fund a web space at which functional requirements (in the form of user stories) can be collected, distributed, and addressed by subject matter experts.

h3. Stakeholders

*BO:*  Patrick Masson
*PO:*  Patrick Masson
*TO:*  Patrick Masson
*SM:*
*Others:*
{column}
{column:width=50%}

h3. Current Status
----
INSERT JIRA OPEN ISSUES & ACTIVITY STREAM FOR Edu2ools HERE

{column}
{section}

h3. Background

Three significant barriers impede the evaluation and enthusiastic adoption of academic technologies, such as learning management systems, among institutions of higher education, including UMassOnline.

# The evaluation process for software on college and university campuses is evolving from a focus on features (the tools and technologies native to a system, e.g. calendars, file uploads, discussion forums) to an assessment of affordances (functionality, i.e. collaboration, communication, etc.).
# For obvious reasons, not the least of which is monetary incentive, vendors of academic technologies readily respond to Requests for Proposals with product descriptions in line with the interested institution's requirements. Many products do not enjoy direct support from vendors (open source tools such as Bedework, DuraSpace, Drupal, Fedora, Mahara, Moodle, OSP, Sakai, uPortal, etc.), or if they do, those vendors may not have the capacity to compete with larger competitors (e.g. Atlassian versus Microsoft, Canonical versus Apple). Without a vendor to promote a product, and thus respond to formal requests from institutions of higher education, many quality and potentially useful academic technologies go unrecognized. Even where support services for open source options exist through third-party affiliates, these providers may not be interested in a university’s evaluation stage: because such organizations specialize in support and hosting services, there is little incentive to engage with an institution until it has already identified (and thus evaluated) a specific application/technology.
# As explained by WCET, "As product features became more common \[across systems within a product category\], a focus on 'essential questions' provides more value to educators deciding among competing products." Assessment by campuses is usually based on what a system has (its tools), rather than what it enables (its functionality). As different products reach feature parity, a feature-to-feature comparison becomes less valuable.

In an effort to overcome these barriers to evaluation and informed decision-making, UMassOnline is volunteering to create, manage, and fund a web space at which functional requirements (descriptions of teaching and learning activities and objectives) can be collected and featured: think, “next-generation EduTools.” To do this, we at UMassOnline propose developing 'user stories' that describe what a system can do, not merely what it has. A user story is one or more sentences in the everyday language of faculty, students, technologists, administrators, etc. that captures what the user wants to achieve (i.e. a stakeholder, an activity, and an outcome). This represents a radically different, but immensely valuable and reliable, approach to evaluating and selecting academic technologies. Through open dialogue among current adopters (campus faculty and staff), commercial affiliates, and developers, a reference library of user stories describing activities will emerge, along with 'testing scripts' (i.e. user instructions) to assess whether the desired outcomes can be achieved (i.e. the functional affordances of the tool under assessment).

Awareness and promotion from various technology providers could be instrumental in the success of this site by simply endorsing the concept and encouraging participation. This would likely result in contributions from scores of experienced users and developers who could address the user stories with authority and within a short period of time. This, then, would establish a reference resource, much like the former EduTools website, that could be reused by other institutions facing the three barriers outlined above. In addition, because new user stories can continually be contributed as well as testing scripts, the site would remain relevant, reflecting the current teaching and learning functionality of the day, rather than legacy feature sets of deprecated systems.

This approach, i.e. identifying functional requirements rather than a feature list, provides several opportunities:

* enables potential adopters to understand functionality (what the system can do) rather than features (what tools the system has);
* avoids the never-ending list of sub-features (a discussion forum... that can be sorted by author, date, subject, or priority; can be searched; can be exported as a PDF, text, .doc, or .docx; can have assessments; has permissions: private, private between author and instructor, private within a group, public to the class, public outside of class, etc., ad nauseam);
* separates tools from techniques (maybe the best approach to achieve a desired outcome is not through a discussion forum, but through a wiki or blog);
* extensible across systems and versions ("As a faculty member, I want to create small groups so that students can do peer to peer assessments," will be a teaching and learning practice longer than any technology, e.g. a bulletin board, track changes, discussion forums, wikis, blogs, etc.);
* addresses multiple teaching and learning styles (there may be various approaches to accomplish a learning activity or objective, that is, one outcome may be achieved using various tools);
* contributes to training (the same instructions that help evaluators assess the functionality --- creating small groups and setting them up to assess each other's work --- can be used to train new users and support the help desk as a knowledge base);
* provides end-user documentation for use (a comprehensive knowledge base for the platform, a timely opportunity considering the upcoming release of Sakai 3.0);
* comprehensiveness (as new teaching and learning scenarios are described and contributed as user stories, they can be used for functional requirements gathering, not only to assess current LMS functionality, but to help define future development);
* creates community (non-technical development sessions can be offered at conferences to both create user stories and develop and contribute testing scripts);
* extends ways to contribute (provides opportunities for current adopters and commercial partners to contribute outside of code development, extending participation to campuses and the non-technical contributors on those campuses: instructional designers, faculty, even students).

The development of such a project, driven by hands-on end users and adopters with a special focus on functionality versus feature-to-feature comparisons, speaks to the essence of the various communities of collaboration within higher education: EDUCAUSE, IMS GLC, Jasig, JISC, NERCOMP, Sloan-C, UMassOnline, WCET, etc.


UMassOnline has engaged with four commercial providers of Learning Management Systems (Blackboard, Desire2Learn, eCollege, and Instructure) who have created testing scripts against our user stories. UMassOnline will be posting both our user stories and the testing scripts to the public domain so that other colleges and universities can reference our work as they undertake their own LMS assessments. In addition, we hope to extend the user stories by inviting other campuses to contribute their own unique needs, ensuring the catalog continually reflects the teaching and learning functionality required to deliver online education.

Finally, UMassOnline sees this approach as applicable to other systems beyond the LMS and will encourage development of user stories and testing scripts for other academic technologies.


h3. Community

*User Story Contributors*

* Quinsigamond Community College
* Salem State University
* UITS
* UMass Amherst
* UMass Boston
* UMass Dartmouth
* UMass Lowell
* UMass Medical School
* UMassOnline

*Testing Script Contributors*
* Blackboard (Learn 9.1)
* Desire2Learn
* Instructure (Canvas)
* Pearson (eCollege)

*Collaborators*
* CLAMP
* Jasig
* JISC
* Sakai Foundation
* WCET



h3. Licensing

_Who is responsible for the licensing? How do folks obtain a license?_

h3. Resources

[nifti:Letter to Sakai Foundation|^Invitation for Participation.pdf]




h3. Support

_Support Model here_

h3. Training

_Training Model here_