A framework for improving web accessibility and usability of Open Course Ware sites


Accepted Manuscript

Germania Rodríguez, Samanta Cueva, Rommel Torres, Jennifer Perez

PII: S0360-1315(17)30045-3
DOI: 10.1016/j.compedu.2017.02.013
Reference: CAE 3136
To appear in: Computers & Education
Received Date: 15 July 2016
Revised Date: 17 February 2017
Accepted Date: 22 February 2017

Please cite this article as: Rodríguez G., Cueva S., Torres R. & Perez J., A framework for improving web accessibility and usability of Open Course Ware sites, Computers & Education (2017), doi: 10.1016/ j.compedu.2017.02.013. This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.


A framework for improving web accessibility and usability of Open Course Ware sites

Germania Rodríguez*¹, Samanta Cueva¹, Rommel Torres¹

¹ Department of Computer Science and Electronics, Universidad Técnica Particular de Loja, San Cayetano Alto, 1101608, Loja, Ecuador.
[email protected], [email protected], [email protected]

Jennifer Perez²

² Universidad Politécnica de Madrid, E.T.S. Computer Systems Engineering, Ctra. de Valencia km 7, 28031, Madrid, Spain.
[email protected]

Keywords: Accessibility, Usability, Open Educational Resources, Open Course Ware


A framework for improving web accessibility and usability of Open Course Ware sites


Abstract: Since its inception, one of the primary goals of the Internet has been open access to information and documents online. This openness aims to allow access to universal knowledge. Open Educational Resources (OER) have promoted this goal in the context of education. The OER of higher education are supported by means of the Open Course Ware (OCW) initiative. OCW aims to provide access to the knowledge produced by universities. However, the levels of access to and use of OCW do not meet expectations. For this reason, it is necessary to provide solutions to increase the accessibility and usability of OCW. As a result, this paper presents a methodology for the evaluation of the accessibility and usability of OCW sites, as well as a framework for improving their accessibility and usability. This methodology and framework have been applied to evaluate and improve the accessibility and usability of a real case study, the OCW initiative of the Universidad Técnica Particular de Loja (UTPL). This case study has allowed us to validate the methodology and the framework in a real setting in order to determine whether they were able to identify and suggest improvements for the accessibility and usability of OCW when required.


Keywords: Accessibility, Usability, Open Educational Resources, Open Course Ware

1. Introduction

Recent Internet expansion and its immersion in our lives have brought about significant changes in society. Since its inception, one of the primary goals of the Internet has been the open access of information and documents online. In fact, today’s society is known as the Information Society.


The openness of Internet content aims to promote access to universal knowledge (UNESCO, 2002). This fact and the application of Information Technologies (IT) in education have provided new and better ways of learning. The open movement has promoted this goal through the Open Educational Resources (OER) (UNESCO, 2002). The William and Flora Hewlett Foundation (William and Flora Hewlett Foundation, 2016) defines OER as ‘educational materials of high quality, open license, online that offer an extraordinary opportunity to share, use, and reuse the knowledge by people from all over the world’. The OER of higher education are supported by the Open Course Ware (OCW) initiative (Caswell, Henson, Jensen, & Wiley, 2008). An OCW is a free and open digital publication of high quality educational materials, organized as courses. (Pollak, 2008) states that thanks to OCW ‘Web published course syllabus, reading lists with links to open access articles, course and lecture notes, video/audio lectures and audio-synched slideshows, together with essay assignments, problem sets, past exam papers and full-text reading, are now being used by self-learners, students, and young entrepreneurs worldwide’. Often, OCW include materials such as planning, curriculum, and calendars, as well as thematic content, such as text books, lectures, presentations, notes and simulations (OEC, 2016). The Massachusetts Institute of Technology (MIT), a pioneer in this initiative since 2001, claims, ‘The idea is simple: post all our course materials online and make them widely available for everyone’, and therefore knowledge is unlocked (MIT, 2016). The MIT OCW is a web-based publication of virtually all MIT course content. The OCW is open and available to the world. (Rhoads, Berdan, & Toven-Lindsey, 2013) argue that ‘Through OCW empowering minds, educators improve courses and curricula, making their schools more effective; students find additional resources to help them succeed; and independent learners enrich their lives and use the content to tackle some of our world’s most difficult challenges, including sustainable development’ (Rhoads et al., 2013). The continuous evolution of open educational content, technology and pedagogy highlights the fact that they are a key field of study and research. However, levels of access to and use of OCW remain below expectations. Fifteen years after the first MIT OCW initiative (MIT, 2016), and despite the fact that hundreds of universities have joined this initiative by offering tens of thousands of online courses (Vladoiu, 2011), the main purpose of OCW, that is ‘to advance knowledge and education worldwide, through access to high quality content’, has not been fulfilled. As a result, it is necessary to provide solutions to increase the accessibility and usability of OCW (Çakmak, Özel, & Yılmaz, 2013).


This paper presents a framework to identify improvements to increase the accessibility and usability of OCW sites. This framework is based on the standards and regulations of usability and accessibility that are more representative and referenced at the international level: the regulation ISO/IEC 13407: Human-Centered Design Processes for Interactive Systems (ISO 13407, 1999) and the Web Content Accessibility Guidelines 2.0 (WCAG) (W3C, 2008), specifically the guidelines of accessibility of content evaluation methodology, WCAG-EM (W3C, 2014a). In addition, the paper presents the methodology that the framework applies for the evaluation of accessibility and usability. Finally, this contribution is evaluated by applying the framework and the methodology to a real setting, specifically to the OCW initiative of the Universidad Técnica Particular de Loja (UTPL).


This paper is structured as follows: Section 2 describes the most important features of the OER, OCW and Massive Open Online Courses (MOOC). Section 3 details the standards, rules and guidelines that currently exist for the evaluation of usability and accessibility of websites. Section 4 presents the designed methodology and framework for the evaluation and improvement of accessibility and usability of OCW sites. Section 5 describes the case study application and the results obtained. Finally, Section 6 presents conclusions and suggestions for future work.

2. Open Educational Resources (OER), Open Course Ware (OCW) and Massive Open Online Courses (MOOC)

The application of IT in the context of education has provided new and better ways of learning. E-Learning is mainly characterized by certain criteria: (1) learning can be anywhere and anytime via the Internet and (2) a Learning Management System (LMS) facilitates interaction between students, teachers and instructional content (Rojas & Montilva, 2011).


Reuse and integration of teaching content is one of the priorities of e-Learning. In the current interconnected and globalized world, the Internet is one of the major means of communication; therefore, it does not make sense to invest time and effort in constructing teaching resources in isolation. With the emergence of the World Wide Web in the early 1990s, the idea of reusable materials came to the forefront once more. In 1994, Wayne Hodgins coined the term learning objects (LO) for these materials (Hodgins, 2002). LO received immediate acceptance from educators and instructional designers because they could be easily reused in a wide range of teaching and learning situations (Polsani, 2003). The most frequently cited definition of LO is that put forth by the Institute of Electrical and Electronics Engineers’ Learning Technology Standards Committee (IEEE, 2005): ‘Learning Objects are defined here as any entity, digital or non-digital, which can be used, re-used or referenced during technology supported learning’. However, it is important to take into account that a Learning Object with a high quality level is not profitable if it is only accessible to a few users on a particular platform. OER appeared as a result of the definition of legal frameworks for open licenses, such as (Creative Commons, 2016). The term OER was first adopted by UNESCO in 2002, specifically in the final report of ‘the Forum on the impact of the open courses for higher education in countries under development’. This report refers to OER as ‘the provision of open educational resources through the information technologies and communication in order to be queried, used and adapted by a community of users with non-commercial purposes’ (UNESCO, 2002). According to (UNESCO, 2012), the desirable features of an OER are open access and author acknowledgment. In addition, (Rodríguez & Cueva, 2010) propose a list of desirable features for OER as follows:

• Accessibility: OER should be available in any place and at any time. They should be able to be discovered and used through the web.

• Reusability: OER should be modular to be reused several times in different contexts without any modification.

• Interoperability: OER should provide universal access independently of the tools that try to reach them.

• Metadata: OER should have associated metadata in order to enable indexing, storage, search and retrieval facilities (Varlamis & Apostolakis, 2006).

(Camilleri, Ehlers, & Pawlowski, 2014) propose a classification of OER. This classification of contents of an OER is shown in Figure 1 in such a way that a course (Courseware) could be an OCW or a MOOC.

Figure 1: Classification of OER

The OCW emerged to provide access to the knowledge of universities. The first OCW initiative was launched by the Massachusetts Institute of Technology (MIT) in 2001, defining OCW as ‘courses complete, open, and available through the web, which are composed of different digital resources such as academic programs, presentations, readings, tasks, tests, video conferencing and other OER’ (MIT, 2016).


On the other hand, MOOC is considered a synonym of open education, since it includes “open” in the title; however it cannot be considered OER, since it does not fulfil the principles of OER of being free and available. MOOC are courses that charge for certification, and their access is restricted to those enrolled in the course and materials are protected by copyright. (Camilleri et al., 2014) classify MOOC into two types: the cMOOC, which uses and creates OER and is based on the Creative Commons license; and the xMOOC, which has copyrighted material. The majority of MOOC are xMOOC, since they have had great success because of the prestige of companies that offer them, such as (Coursera, 2016), (Miriadax, 2015), and (Educause, 2016). Table 1: Comparison of MOOC and OCW

Factor | OCW | MOOC
Author/supplier | Universities | Companies
Copyrights | Belong to the author | They are transferred to the company
Rights of use | Belong to the user | Restricted user
Accessibility | All the time | Duration of the course
Accreditation | There is no accreditation | It grants accreditation
Evaluation | Without evaluation | With evaluation
Work-oriented | Individual | Collaborative
License | Open | Closed
Content type | Static | Dynamic

This research focuses on OCW, since their OER are produced and distributed by universities, and they are free and open in practice, thus making global access to knowledge a reality. The OCW provide students and teachers the opportunity to access the most prestigious educational institutions in the world and learn from their faculty and teaching experience (Terrell & Caudill, 2012). The OCW provide advantages to (i) the university that publishes them, (ii) the teachers that create and structure them, and (iii) their users, whether students, self-taught learners or others (Gómez, Callaghan, Eick, Carson, & Andersson, 2012).


Table 1 summarizes the most important aspects that differentiate OCW and MOOC. The table expands on the proposal of (Martínez, 2014), which compares them from the point of view of the copyright provider and of creation and usage rights.

3. Accessibility and Usability

Accessibility and usability are key features in any application or website (Dubois, 2012). They reinforce each other in the design of websites, since accessible websites are more usable and vice versa (Romero, 2001). For this reason, accessibility is an essential principle in technology, and even more so on the web. Tim Berners-Lee argues ‘The social value of the Web is that it enables human communication, commerce, and opportunities to share knowledge. One of World Wide Web Consortium (W3C, 2016a) primary goals is to make these benefits available to all people’. Usability is complementary to accessibility. Usability is a complex concept due to the complex nature of human beings. The definitions of web accessibility and usability most frequently referenced in the field are those provided by the ISO standards. They are detailed below:

• Accessibility is the use of a product, service, framework or resource in an efficient, effective, and satisfying way by people with different abilities (ISO 9241-171, 2008).

• Usability is the degree to which a product can be used by specific users to achieve specific goals with effectiveness, efficiency and satisfaction in a specific context of use (ISO 9241-11, 1998).

In the case of OCW sites and OER initiatives, accessibility and usability acquire even greater relevance. This is due to the fact that accessibility and usability give OCW and OER a real impact on the knowledge of our society, since their contents are the learning target of thousands of students around the world. Nowadays, there is a very wide range of OER from many disciplines, as well as well-organized search infrastructures. Despite all the benefits claimed for OER, studies reveal that they are used less than anticipated (Jung, Sasaki, & Latchem, 2016).

3.1 Measurement of web accessibility


To obtain a quantitative assessment (measurement) of web accessibility, several methods, standards and international guides are available. The Accessibility Evaluation Methods (AEM) may differ in terms of effectiveness, efficiency and utility (Brajnik, 2008). Regarding standards and norms, the most representative and globally referenced are those presented in the W3C Web Accessibility Initiative (WAI) (W3C, 2016b). WAI brings together people from industry, disability organizations, governments, and research centers of the world to develop guidelines and resources to help make the web more accessible to people with disabilities, including speech, auditory, cognitive, neurological, physical and visual disabilities. The WAI initiative comprises standards, guidelines and techniques in different versions (W3C, 2016b). This work is focused on its guidelines for websites and web applications, the Web Content Accessibility Guidelines (WCAG 2.0) (W3C, 2008). WCAG 2.0 (W3C, 2008) covers a wide range of recommendations to make web content more accessible for people with disabilities, and these recommendations also make web content more useful for all users. It establishes three levels to structure and guide the accessibility evaluation of web content. They are the following:

• 1st level - Principles: They define the foundations of web accessibility: perceivable, operable, understandable and robust.

• 2nd level - Guidelines: There are twelve guidelines that refine the principles. These guidelines provide the basic objectives that authors must achieve in order to create more accessible content for users with different levels of disability.

• 3rd level - Success Criteria: For each of the twelve guidelines, the WCAG 2.0 provides verifiable compliance success criteria. They allow one to evaluate requirements and needs such as design specifications, purchasing, regulation or contractual agreements. To conform to WCAG 2.0, the Success Criteria must be satisfied, guaranteeing that there is no content that violates them. The evaluation of success criteria is performed in terms of three levels of conformance, such that one of the three is met in full:

  o Level A (lowest): The web page satisfies all the Level A Success Criteria, or a conforming alternate version is provided.

  o Level AA (medium): The web page satisfies all the Level A and Level AA Success Criteria, or a Level AA conforming alternate version is provided.

  o Level AAA (highest): The web page satisfies all the Level A, Level AA and Level AAA Success Criteria, or a Level AAA conforming alternate version is provided.
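As an illustration of how these cumulative conformance levels combine, the small sketch below derives a page's level from per-level results; the representation of the results (a dictionary of failure counts) and the function name are our own assumptions, not part of WCAG 2.0:

```python
def conformance_level(failed_criteria: dict) -> str:
    """Derive the WCAG 2.0 conformance level of a page from the number of
    unsatisfied success criteria at each level (criteria for which no
    conforming alternate version is provided). Levels are cumulative."""
    if failed_criteria.get("A", 0) > 0:
        return "not conformant"
    if failed_criteria.get("AA", 0) > 0:
        return "A"
    if failed_criteria.get("AAA", 0) > 0:
        return "AA"
    return "AAA"

# Example: the page passes every Level A criterion but fails 2 AA and 5 AAA criteria
print(conformance_level({"A": 0, "AA": 2, "AAA": 5}))  # -> "A"
```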

Table 2: Percentage of accessibility level based on the compliance level of the accessibility criteria (Hilera et al., 2013)

Level | Percentage | Description
High | 70 - 100 | The conformance of accessibility success criteria is high or outstanding, presenting minimum limitations to access to the content and functionality of the website.
Moderate | 50 - 70 | The conformance of accessibility success criteria is moderate or acceptable, presenting few limitations to access to the content and functionality of the website.
Deficient | 25 - 50 | Conformance with the accessibility success criteria is low or deficient, presenting difficulties and barriers to access to the content or functionality of the website.
Very poor | Less than 25 | Conformance with the accessibility success criteria is very poor, presenting difficult access to the content and functionality of the website.

In addition to evaluating accessibility, WAI also provides other resources that support and help in its assessment. These facilities for adopting the WAI are useful for assessing the accessibility of OCW sites. They are the following:

• WCAG-EM, Web Content Accessibility Guidelines - Evaluation Methodology (W3C, 2014a): It provides a guide on how to evaluate the fulfilment of the Web Content Accessibility Guidelines (WCAG) 2.0 by websites. It describes a procedure to evaluate websites, providing considerations in order to guide reviewers and to promote good practices. It is mainly designed to evaluate existing websites and it is suitable for different contexts of evaluation, including self-assessment and third-party assessment.

• First review of a web page (W3C, 2013): It provides simple steps to help assess the accessibility of a web page.

• List of tools (W3C, 2015b): It includes 69 automatic accessibility evaluation tools to date. This work has selected two of these tools for supporting the evaluation of the WCAG 2.0 standard and taking advantage of their valuable feedback through their constant updates (W3C, 2013). They are the following:

  o TAW: A tool that allows one to analyze, study and validate web pages in order to make them accessible in two different frameworks: WCAG 1.0 and WCAG 2.0.

  o Checker: A tool that allows the free evaluation of the web accessibility of a site. This tool can use a wide variety of accessibility guidelines, such as WCAG 1.0 and WCAG 2.0.

Complementing the evaluation of the WCAG 2.0, (Hilera, Fernández, Suárez, & Vilar, 2013) proposed a complementary scale of four levels of accessibility based on a percentage, which is established from the results obtained for the compliance level of the accessibility criteria. The scale is summarized in Table 2.
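For instance, once a compliance percentage has been computed, mapping it onto this four-level scale is straightforward. The sketch below is illustrative only; the thresholds come from Table 2, while the function name and the treatment of the boundary values are our own assumptions:

```python
def hilera_level(compliance_percentage: float) -> str:
    """Map a compliance percentage onto the four-level scale of
    Hilera et al. (2013) summarized in Table 2."""
    if compliance_percentage >= 70:
        return "High"
    if compliance_percentage >= 50:
        return "Moderate"
    if compliance_percentage >= 25:
        return "Deficient"
    return "Very poor"

# Example: the usability percentage reported for the case study in Section 5
print(hilera_level(69.44))  # -> "Moderate"
```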


3.2 Measurement of web usability

The evaluation of website usability will always be needed (Ferré, 2005), since there will always be something to improve that facilitates use. According to (ISO 13407, 1999), the evaluation of usability is used for:

1. Providing feedback to improve the design.
2. Assessing the user and organization goals to determine if they have been achieved.
3. Monitoring the usage of products or systems in the long term.

The usability guide of the standard (ISO 9241-11, 1998) describes the benefits of measuring usability in terms of performance and user satisfaction. This measurement is performed by taking into account the achieved degree of the three usability measures:

• Effectiveness represents the accuracy with which users reach their goals. An example could be to measure the effectiveness of a search mechanism, i.e. the percentage of searches in which users really find the specific resource that they search for in the website.

• Efficiency represents the correlation between the consumed resources and the certainty with which users achieve their goals. In the example of the search mechanism, the efficiency is the effort and time that the user takes to find a specific resource in the website.

• Satisfaction represents comfort and acceptance of use. In the previous example, satisfaction is measured in terms of how comfortable the search is for the user.

There are automatic, free and online tools that allow the measurement of efficiency and effectiveness. These tools evaluate and obtain metrics such as loading speed, security degree, percentage of access and browsing in different browsers or devices, etc. From the existing tools, in this work we have used two of them:

• (Google PageSpeed, 2016): This tool measures the performance and the loading speed of a website for mobile and desktop devices. From the analysis of the website, the tool also provides a set of tips for improving usability.

• (Google Analytics, 2016): This tool provides detailed reports and statistical information about the visits to a website, as well as the browser or the type of device from which it is accessed.
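As an illustration of the first two measures, both can be derived from simple logs of the search example above. The sketch below is our own; the log structure and field names are assumptions, not something prescribed by ISO 9241-11:

```python
from dataclasses import dataclass

@dataclass
class SearchAttempt:
    found_resource: bool   # did the user find the resource they were looking for?
    seconds_spent: float   # time spent on this search

def effectiveness(attempts):
    """Percentage of searches in which the user found the target resource."""
    return 100 * sum(a.found_resource for a in attempts) / len(attempts)

def efficiency(attempts):
    """Average time spent per successful search (less time means more efficient)."""
    successful = [a.seconds_spent for a in attempts if a.found_resource]
    return sum(successful) / len(successful)

log = [SearchAttempt(True, 12.0), SearchAttempt(False, 40.0), SearchAttempt(True, 18.5)]
print(f"effectiveness = {effectiveness(log):.1f}%, efficiency = {efficiency(log):.1f} s per success")
```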

Satisfaction is less quantifiable using tools because of its subjective nature. There are some proposals, such as (Agarwal & Venkatesh, 2002), (Seffah, Donyaee, Kline, & Padda, 2006) and (Súarez, Martínez, Alvarez, & Alva, 2013), that suggest measuring it in terms of usability. Currently, the Sirius framework (Súarez et al., 2013) is one of the most complete proposals to measure usability based on heuristics, and by extension, to measure satisfaction. This framework measures 83 criteria of 10 aspects under evaluation in a quantitative and qualitative way. These aspects are: General Aspects (GA), Identify and Information (II), Structure and Navigation (SN), Labelling (LB), Layout of the Page (LY), Comprehension and easy Interaction (CI), Control and Feedback (CF), Multimedia Elements (ME), Search Elements (SE), and Help Elements (HE). Each aspect is composed of a set of measurable criteria (see Table 6 in (Rodriguez, Pérez, Cueva, & Torres, 2017)). For example, the aspect General Aspects (GA) is composed of 10 measurable criteria related to general aspects of the website. Each criterion is measured from 0, when not compliant at all, to 10, when fully compliant. In addition to this quantitative measure (numerical value), the criterion is measured in a qualitative way by labelling it with NWS: Not compliant in the whole site (numerical value 0), NML: Not compliant in the main links (numerical value 2.5), NHP: Not compliant in the home page (numerical value 5), NSP: Not compliant in one or more subpages (numerical value 7.5), YES: Fully compliant (numerical value 10), or NA: Criterion not applicable in the site, without numerical value (see Table 7 in (Rodriguez et al., 2017)). For example, the criterion ‘GA10: Translation of the page is complete and correct’ of the aspect ‘GA: General Aspects’ will be 10 and YES when all the pages of the website are properly and fully translated into the different languages that it provides; whereas it will be 7.5 and NSP when some pages of a submenu do not support all the languages. In addition, Sirius takes into account the type of website to prioritize the criteria, because depending on the type, the criticality of each one would be different. The weighting of criteria relevance is applied to the value of each criterion using the following scale: critical 8, major 4, moderate 2 and minor 1. Sirius defines 17 types of websites, including: Public Administration/Institutional, Online banking, Blog, E-Commerce, Communication/News, Personal, Service Portal, Webmail/Mail, Hybrid, Downloads, and Education/Training. Since this work is focused on OCW sites, it applies the Sirius weighting of criteria relevance that belongs to Education/Training websites. For example, in Education/Training websites the criteria with critical weight (8) are ‘GA2: Content and services are precise and complete’ and ‘SE5: Simple and clear search system’, whereas these criteria differ in Blog websites, since GA2 has a moderate weight (2), and SE5 has the same weight as in Education/Training websites, critical 8.


The results obtained from the Sirius evaluation are presented as the list of critical usability criteria and the usability metric. This metric is a usability percentage that applies a Correction Factor:

• Correction Factor: an adjustment value that is applied to each of the assessed criteria in order to obtain different levels of usability depending on their relevance and on the type of website that has been evaluated. The resulting values from the application of this correction factor are used to obtain the percentage of usability. The correction factor is calculated by dividing the relevance value of each criterion by the sum of all the relevance values for that type of website:

fc = rc / Σ rc

where fc is the correction factor to be applied to a criterion and rc is the relevance value of the criterion.

For example, for the criterion ‘GA2: Content and services are precise and complete’ the relevance value is 8 (rc = 8), and the sum of the relevance values of the criteria for Education/Training websites is the following: 2 criteria are critical (2 × 8 = 16), 21 criteria are major (21 × 4 = 84), 51 criteria are moderate (51 × 2 = 102) and 9 criteria are minor (9 × 1 = 9). Therefore, the sum is 16 + 84 + 102 + 9 = 211. Thus:

fc_GA2 = 8 / 211 = 0.0379    (i)

Finally, the proposed formula for the determination of the percentage of usability of a website is the following:

PU = ( Σ (fc_i × vc_i) / Σ (fc_i × 10) ) × 100

where PU is the percentage of usability, fc is the correction factor applied to the evaluated criterion, nce is the number of evaluated criteria (its maximum value is 83, the number of criteria established by this evaluation system; if some of the criteria are considered not applicable to the website under evaluation, they are not evaluated), and vc is the evaluation value (between 0 and 10). Both sums run over the nce evaluated criteria.

For example, if there are 6 users evaluating the usability, each one needs to fill in the usability questionnaire (see Table 8 in (Rodriguez et al., 2017)) to calculate the PU. In this simple example, we assume that only 2 of the criteria were evaluated, GA2 and ‘SE5: Simple and clear search system’. Both of them have the same correction factor of 0.038 (see (i)), since they have the same critical weight (8) in Education/Training websites. The users valued GA2 with the values 5, 7, 8, 6, 9 and 9, whereas they valued SE5 with the values 8, 9, 10, 9, 10 and 9. Taking this data into account, the calculation of PU is the following:

vc_GA2 = (5 + 7 + 8 + 6 + 9 + 9) / 6 = 7.3
vc_SE5 = (8 + 9 + 10 + 9 + 10 + 9) / 6 = 9.2
nce = 2

Σ (fc_i × vc_i) = (0.038 × 7.3) + (0.038 × 9.2) = 0.627
Σ (fc_i × 10) = (0.038 × 10) + (0.038 × 10) = 0.76
PU = (0.627 / 0.76) × 100 = 82.5%
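The complete Sirius computation (correction factors from the relevance weights, followed by the usability percentage) can be sketched as follows. The data structures and function names are our own; the figures reproduce the worked example above, restricted to the two criteria GA2 and SE5:

```python
# Sirius relevance weights: critical 8, major 4, moderate 2, minor 1.
# For Education/Training websites the text reports 2 critical, 21 major,
# 51 moderate and 9 minor criteria, so the total relevance is 211.
TOTAL_RELEVANCE = 2 * 8 + 21 * 4 + 51 * 2 + 9 * 1   # = 211

def correction_factor(relevance, total_relevance=TOTAL_RELEVANCE):
    """fc = rc / (sum of all relevance values for the website type)."""
    return relevance / total_relevance

def usability_percentage(ratings, relevance):
    """PU = 100 * sum(fc_i * vc_i) / sum(fc_i * 10), where vc_i is the mean
    user rating (0-10) of criterion i; non-applicable criteria are omitted."""
    numerator = denominator = 0.0
    for criterion, values in ratings.items():
        fc = correction_factor(relevance[criterion])
        vc = sum(values) / len(values)
        numerator += fc * vc
        denominator += fc * 10
    return 100 * numerator / denominator

# Worked example from the text: 6 users rating GA2 and SE5 (both critical, weight 8)
ratings = {"GA2": [5, 7, 8, 6, 9, 9], "SE5": [8, 9, 10, 9, 10, 9]}
relevance = {"GA2": 8, "SE5": 8}
print(f"PU = {usability_percentage(ratings, relevance):.1f}%")   # 82.5%, as in the example
```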

4. A framework and evaluation methodology for improving web accessibility and usability of OCW sites


The framework and evaluation methodology form a complete solution to identifying usability and accessibility problems and they provide concrete solutions. Figure 2 shows the process of implementation of the framework through the methodology.


Figure 2: Deployment process of the framework for improving web accessibility and usability of OCW sites through the defined methodology

4.1 Framework Definition


The defined framework for improving the web accessibility and usability of OCW sites allows one to identify possible problems of usability and accessibility, as well as their solutions or improvements. This framework has been defined on the basis of the existing standards (see Section 3). Specifically, the framework integrates two main components: (i) the accessibility principles of the WCAG 2.0 to ensure the accessibility of the site; and (ii) the usability measures proposed by the (ISO 9241-11, 1998) standard and Sirius (Súarez et al., 2013). As a result, this framework meets both requirements in its broader spectrum. In addition, the framework has been enriched by identifying specific actions for OCW sites (improvement solutions, see Table 3); these actions were derived from the experience of specific OCW assessments, for example the case study presented in this paper (see Section 5) (see Figure 3).

[Figure 3 diagram: Accessibility (4 Principles, 12 Guidelines, 61 Success Criteria) and Usability (3 Measures; Sirius: 10 Aspects, 83 Criteria) feed the Accessibility and Usability Framework]

Figure 3: Sources from which the framework for improving the usability and accessibility of OCW sites has been defined


To evaluate accessibility, the framework uses the criteria, guidelines and principles of the WCAG version 2.0 (see section 3.1.), as well as recommendations for making web content more accessible for people with disabilities, including blindness, low vision, deafness and hearing loss, learning or cognitive limitations, limited movement, speech problems, photo sensitivity and combinations of these disabilities.

To evaluate usability, the framework uses the three usability measures of (ISO 9241-11, 1998) Part 11: Guidance on usability (efficiency, effectiveness and satisfaction). To evaluate satisfaction, the framework uses Sirius (Súarez et al., 2013) (see section 3.2).


In addition, the defined framework includes problems that were identified in previous assessments of the usability and accessibility of OCW. It consists of a set of guidelines grouped into two parts: accessibility and usability. Due to the extension of the framework this paper presents the representative examples described in Tables 3 and 4 (see Tables 5 and 2 of (Rodriguez et al.,2017)). Table 3 shows an example of accessibility improvement for dealing with a possible error with regard to one of the 61 accessibility criteria by principle and level defined by the WCAG 2.0. Table 4 also presents an example of usability improvement due to an error in one of the 83 usability criteria organized by priority, attribute and aspects.

Table 3: Accessibility improvements for OCW sites

Principle / WCAG 2.0 guideline: PERCEIVABLE: Information and user interface components must be presented to users in ways they can perceive. Guideline 1.1 Text alternatives: provide text alternatives for any non-text content so that it can be changed into forms such as large print, Braille, speech, symbols or simpler language.
Accessibility criterion (WCAG 2.0): 1.1.1 Non-text content: all non-text content that is presented to the user has a text alternative that serves the equivalent purpose.
Possible error: Non-text content without alternative text within the contents of the OCW site.
Accessibility level: A
Improvement solution: Include alternative text for all non-text content; validate the OCW site using automatic accessibility evaluation tools.
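One possible way to encode such framework entries, so that an evaluation report can look up the suggested solution for a failed criterion, is sketched below. The field names, the dictionary keyed by success criterion and the lookup are our own assumptions; the content simply restates the row of Table 3:

```python
from dataclasses import dataclass

@dataclass
class AccessibilityImprovement:
    principle: str        # one of the four WCAG 2.0 principles
    guideline: str        # WCAG 2.0 guideline the criterion belongs to
    criterion: str        # WCAG 2.0 success criterion
    level: str            # conformance level of the criterion: "A", "AA" or "AAA"
    possible_error: str   # error pattern observed in OCW sites
    solution: str         # improvement suggested by the framework

FRAMEWORK = {
    "1.1.1": AccessibilityImprovement(
        principle="Perceivable",
        guideline="1.1 Text alternatives",
        criterion="1.1.1 Non-text content",
        level="A",
        possible_error="Non-text content without alternative text in the OCW site",
        solution="Include alternative text for all non-text content and validate "
                 "the site with automatic accessibility evaluation tools",
    ),
}

# During Step 4 of the methodology (audit), a failed criterion is resolved to its improvement:
print(FRAMEWORK["1.1.1"].solution)
```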

Table 4: Usability improvements for OCW sites

Attribute (ISO 9241-11): Satisfaction
Aspect: General Aspects (GA)
Criterion: GA2: Contents and services offered are accurate and complete.
Priority (Education/Training websites): CR (critical)
Possible error: The services available are not readily identifiable in the OCW site.
Solution: The services of the OCW site are clearly identified.

4.2 Methodology Definition

The defined evaluation methodology aims to obtain a quantitative assessment (measurement) of the accessibility and usability of OCW sites. The definition of this methodology is based on the WCAG-EM 1.0 evaluation methodology of the Web Accessibility Initiative (WAI) (see section 3.1). One of the goals of this methodology is that it should have the same advantages as the WCAG-EM 1.0, but be specialized for OCW. Therefore, the defined methodology follows the steps defined by WCAG-EM, and some of these steps have been customized to the context of OCW sites. The methodology consists of 5 steps:

Step 1: Define the scope and objective of the evaluation
During this stage, the scope of the evaluation is defined, as well as the type of website. It is determined whether each website is within the evaluation scope. The evaluation is limited to sites that publish OCW resources. The goals of the evaluation are to determine (i) the level of accessibility (A, AA or AAA, with AAA being the level that guarantees the highest accessibility) and (ii) the level of usability, by quantifying the efficiency, effectiveness and satisfaction aspects as established by (ISO 9241-11, 1998).

Step 2: Explore the target website
In this step, the reviewer explores the target website for its evaluation. This exploration allows the reviewer to understand the website in terms of use, purpose and functionality. It includes identifying common web pages, kinds of pages, the essential functionality of the website, the web technologies involved, and considerations about accessibility.

Step 3: Select a representative sample
The reviewer selects a representative sample of the web pages from the OCW site under evaluation. This selection will conform to the evaluation objectives. This selection of a sample is necessary, since the evaluation of all web pages is usually not viable in extensive and complex websites because it would be too repetitive. This selection must be performed by ensuring that the sample is representative, so that the results of the evaluation reflect the accessibility and usability of the website with reasonable confidence. Some factors that influence the selection of the sample size are type, size, age, complexity, consistency and the website development process. The sampling process can be designed by selecting web pages considering some or all the factors specified in Step 2, or it can be performed by randomly selecting the web pages. Another factor to consider is whether to include part or all of the pages of the site.

Step 4: Audit the sample
The audit is carried out on the sample of web pages selected in Step 3. Two types of assessment for quantifying the accessibility and usability are performed: automatic and manual. These assessments provide a quantitative value, but they should also provide the plan with the solutions to the problems or errors that were identified. This plan will be generated based on the proposed framework for improving the usability and accessibility (see Section 4.1). They are described as follows:

• Accessibility evaluation: the 61 accessibility success criteria proposed by the WCAG version 2.0 are considered.

• Usability evaluation: the 3 ISO 9241 measures (efficiency, effectiveness and satisfaction) are considered, and the 83 criteria of Sirius are used to assess satisfaction.

To increase the accuracy of the results, the methodology applies methodological and data source triangulation (Stake, 1995), i.e., different methods to measure the same concern and multiple data sources at potentially different occasions. Therefore, the methodology provides for different types of assessment (the OCW site must be evaluated both automatically and manually) and different sources (users with different levels of knowledge about the management of OCW).


Step 5: Reporting evaluation results
The documentation of the results for each of the above steps is essential to ensure the evaluation process in terms of transparency, replication of the results and their justification. The result of the OCW site evaluation following the framework will be both its quantification in terms of accessibility and usability, and the errors and problems with their possible solutions to increase the accessibility and usability of the OCW site.

5. Case study: Evaluation of the accessibility and usability of OCW of the UTPL


To validate the methodology defined for evaluating the accessibility and usability of OCW sites based on the defined framework, it is necessary to apply it to a real OCW. As the study requires the measures to be performed in a natural context and to validate objectives with both qualitative and quantitative measures, a case study was conducted (Yin, 1994). Case studies allow for the study and research of contemporary phenomena in their natural context in order to look for evidence, increase knowledge or test theories using qualitative analysis (Runeson & Höst, 2009).

This case study involved the evaluation of the OCW site of Universidad Técnica Particular de Loja (UTPL) by applying the defined methodology and framework, which meant studying the accessibility and usability in a real context (Yin, 1994). This provides evidence of the ability of the defined methodology and framework in evaluating and suggesting the improvement of the usability and accessibility of OCW sites in terms of highlighting their shortcomings and errors and providing a set of solutions.


The findings from the case study are reported according to the guidelines for conducting and reporting case study research in software engineering by (Runeson, Host, Rainer, & Regnell, 2012). The goal of reporting on the case study is twofold: to communicate the findings of a study, and to function as a source of information for judging the quality of the study. With this twofold goal and the objective of obtaining feedback on the results to enrich the proposed framework and methodology, the reporting of the case study is described below.


5.1 Research Objective and Questions


The research objective of this case study focuses on evaluating the effectiveness of the defined methodology and framework in identifying problems and improvements of accessibility and usability in a real-life setting. Hence, the research questions to be answered through the case study analysis are formulated as follows:

RQ1: Is the evaluation methodology of accessibility and usability able to quantify the degree of accessibility and usability of an OCW site?

RQ2: Is the evaluation methodology of accessibility and usability able to identify errors and improvements in the accessibility and usability of an OCW site?

RQ3: Is the framework of accessibility and usability of OCW sites able to provide solutions or improvements to problems or deficiencies identified by the methodology?

5.2 Data Collection Procedure

Both quantitative and qualitative data were collected over three months. The collection methods are established by the methodology that is being evaluated in the case study. Moreover, additional tools were used to synthesize the results:

• Calculation tools: To synthesize and quantify the errors and improvements found by the methodology.

• Storage tools: To store the quantitative results obtained by the tools that the evaluation methodology uses.

• Evaluation forms: To report the errors and improvements detected by selecting them in the framework and their respective solutions.

5.3 Analysis and Validity Procedure


The data obtained from the web tools are numeric and are checked and stored by the reviewers. The data extracted from the yes/no forms are also stored for processing and synthesis. Since the data obtained are both numeric and Boolean (yes/no), it is not necessary to introduce any pre-processing technique; the data obtained are quantified automatically.


To increase the accuracy of the study, it is important to use the three existing types of triangulation (Stake, 1995). In this case study, the methodology that is being evaluated introduces methodological and data source triangulation, since it uses different methods, standards and data sources to measure accessibility and usability. In addition, observer triangulation was applied by using more than one observer in the case study (Van Heesch, Avgeriou, & Hilliard, 2012). In particular, three reviewers examine the results of the exploration of the OCW site and analyze the information retrieved. This makes it possible to compare results, avoid errors and rule out tie cases where there are discrepancies.

5.4 Case Study Description


The UTPL is a higher education institution in Ecuador. It has 42 years’ experience in onsite academic training of professionals and 38 years’ experience in distance learning. The UTPL is the pioneer in distance learning in Latin America with university centers (UTPL, 2016) all over Ecuador as well as international university centers in Bolivia, New York, Madrid and Rome. The UTPL has greatly encouraged distance learning, and its technological development in various initiatives has focused on the use of ICTs for supporting the teaching-learning process in distance learning.


In 2010, the UTPL OCW initiative emerged with the purpose of providing free access to the materials of some of the courses it offers. The initiative is part of the network of Iberoamerican universities (Universia, 2016), as well as of the (Open Education Consortium, 2015). The UTPL’s OCW site is implemented on eduCommons (Educommons.com, 2016), an open source Content Management System (CMS) created specifically for open content projects (Open Education Consortium, 2016) (see Figure 4).

Figure 4: The UTPL OCW site

5.5 Subject description

The subjects of the case study are classified into two groups: the observers and the reviewers who participated in the case study. As outlined in section 5.3, there were three observers to ensure accurate triangulation. Table 5 describes the observers’ experience in the field and the role that they play in the case study.

Table 5: Characteristics of observers

Observer | Age | Sex | Profession | Experience | Assessment
Observer 1 | 35 | Female | Engineer in computer systems and computing | 15 years | Accessibility and usability, manual and automatic
Observer 2 | 29 | Male | A graduate of engineering in computer systems and computing | 4 years | Manual and automatic usability
Observer 3 | 27 | Female | A graduate of engineering in computer systems and computing | 1 year | Manual and automatic usability

A total of 132 subjects participated in the case study as reviewers; they were students of the degree in Computer Systems and Computing at the UTPL. The students belonged to different courses and their knowledge of OCW resources varied. Table 6 summarizes the main features of the participant students.

Table 6: Features of the students who participated in the case study reviews

Nº people | Sex | Studies | Course | Modality | Experience | Assessment
132 | Both (male and female) | Computing and Computer Systems, UTPL | All the courses (10 courses) | On site and open education | Two types (with and without knowledge of OCW resources) | Accessibility
21 | Both (male and female) | Computing and Computer Systems, UTPL | 8th course (of 10 courses) | On site | With knowledge of OCW resources | Usability

5.6 Case Study Execution


This section describes how the methodology and the framework were applied to evaluate the accessibility and usability of the UTPL OCW site.

5.6.1 Scope and objective of the evaluation

The proposed framework was deployed in a case study to obtain a quantitative assessment of the accessibility and web usability of OCW sites, as well as to identify their major accessibility and usability constraints. Once these constraints are provided and published to the community, any OCW site may solve them and achieve an immediate improvement in accessibility and usability. Therefore, any OCW site can take advantage of the framework. In particular, this case study evaluates the UTPL OCW. The evaluation of the UTPL OCW site has two main objectives: (i) to evaluate the OCW site as a case study and (ii) to validate the defined framework for improving the accessibility and usability of OCW sites. Specifically, the evaluation consists of:

• Identifying the level of accessibility (A, AA or AAA) and the most frequent errors in terms of accessibility with regard to the defined criteria.

• Quantifying the level of usability of the OCW site and identifying errors with regard to the defined criteria.

5.6.2 Browsing the website

Browsing the website allowed for the identification of the following issues:

Common web pages: The web pages of the OCW site are pages that correspond to courses: their structure is defined and only their content varies.

Types of identified pages: OCW sites mostly consist of three types of pages: (i) the home page, (ii) the degree page or category associated with the OCW, and (iii) the home page of each course.

Web technologies: The UTPL OCW site is implemented in Educommons 3.2.1. Educommons is designed under the (Zope, 2016) architecture, with a ZODB database, and based on the Plone content system (Plone, 2016), which allows interaction with the end user.

Main functionality: The main functionality of this kind of site is the one that the Content Management System (CMS) provides. In the case of the UTPL OCW, the CMS is implemented on Educommons, as has been mentioned before. Educommons facilitates the creation and maintenance of OpenCourseWare projects (Moen, Lusk, Muwanguzi, & Siewert, 2009). In addition to many of the basic technical features, functions, and underlying structure of (Plone, 2016), eduCommons provides additional support for importing preexisting materials from course management systems, accessibility, and standardized metadata.

5.6.3 Representative sample

The UTPL OCW site has approximately 150 pages including the main page. There are 8 internal pages (course information, content plan, learning guide, study materials, assessments, practices and exercises, bibliography and teaching team) for each of the 13 available courses. These courses are organized into 5 categories:

• Economy: Algebra, calculus and statistics.
• Continuing education: Basic computing.
• Civil engineering: Applied and elementary topography.
• Institute of pedagogy: Elaboration of didactic guides for open and distance learning. Distance learning: basics, theories and contributions. Printed materials for distance learning.
• Systems software and computing: Computer basics, mathematical foundations, foundations of programming and discrete mathematics.


5.6.4 Audit sample and results

To perform the audit of the sample, two types of evaluation were performed: automatic and manual. They are described as follows.


5.6.4.1 Automatic evaluation

This evaluation aims to obtain a first approximation of the level of accessibility and usability of the site. It is done using available web tools, which evaluate sites with regard to accessibility and usability standards or regulations. These tools provide accessibility errors and warnings with their description, as well as suggestions to resolve them. The automatic evaluation of accessibility, as the methodology establishes, was performed by using the TAW and A-Checker tools. These tools evaluate the WCAG 2.0 standard to automatically assess accessibility; therefore, the methodology has more than one measure for comparison. Each tool was executed by selecting the WCAG 2.0 assessment option. Figure 5 shows the error percentage that the TAW tool identified for each level of accessibility. The assessment with the A-Checker tool was discarded since the identified errors were too low; in fact, when the tool was analysed, these represented 0%.


Figure 5: Percentage of accessibility errors identified by TAW for each level


The automatic evaluation of usability was also performed using the two tools that the methodology establishes: Google PageSpeed and Google Analytics. The Google PageSpeed results reveal that the UTPL OCW site has a load speed of 68% on PCs and 56% on mobiles. In addition, it provides a set of errors categorized by priority and a suggested solution for each of them (see Table 3 in (Rodriguez et al., 2017)). For example, it identified that in the UTPL OCW site the response of the user’s server did not include explicit cache headers, that the resources were stored in cache for a short period of time, and that improvement was necessary. On the other hand, the Google Analytics tool provides other kinds of information, such as the percentage of access to the OCW site from different browsers, the type of device or the city from which the access is established, etc. (see Table 4 in (Rodriguez et al., 2017)). The results for the year 2016 reveal that the highest percentage of visits to the UTPL OCW are from Ecuador, with 32.12%, as it is the country of the site, and the most used browser and operating system are Chrome (73.59%) and Windows (63.97%), respectively. In addition, in the case of mobile devices, the most used are Apple devices (40.37%) and the most used operating system is Android (56.63%).


5.6.4.2 Manual evaluation


Manual evaluation aims to quantify accessibility and usability through the interaction of the user with the user interface of the OCW site using the established criteria. In this case study, the sample of users is obtained by registering the user’s experience as he/she uses the UTPL’s OCW site. This evaluation is conducted with a set of usability- and accessibility-oriented tests that users answer when using the OCW site (see Tables 8 and 9 in (Rodriguez et al., 2017)).


The manual evaluation of accessibility takes the WCAG 2.0 standard to obtain the percentage of accessibility errors found for each accessibility level A, AA and AAA. In addition, it also identifies accessibility errors that validate or supplement the errors reported by the automatic evaluation. The selected population are UTPL’s Degree of Information Systems and Computing on-site students. Triola’s formula is used to calculate the sample size (Triola, 2012).

n = (N × σ² × Z²) / ((N − 1) × e² + σ² × Z²)

where:

Variable | Description | Value
n | The sample size | To be defined
N | Population size (number of UTPL’s Degree of Information Systems and Computing on-site students in the period April-August) | 280
σ | Standard deviation of the population. When it is not defined explicitly, the constant value 0.5 is used | 0.5
Z | Value obtained by using confidence intervals. If the value is missing, the value corresponding to 95% confidence is used | 1.96
e | Acceptable limit of the sample error. If there is no value, the value used varies between 1% (0.01) and 9% (0.09) | 0.05

Applying the formula, the sample size was 132 real users. Two types of users were identified: users with and without knowledge of OCW resources. However, this difference was transparent for users. The accessibility test used considers the 61 criteria from WCAG 2.0, categorized by each of the 4 principles and by priority. They were valued using the following scale:

• High: Criterion with a high percentage of compliance.
• Medium: Criterion with a moderate percentage of compliance.
• Low: There is no evidence of this criterion, i.e., it is not fulfilled.

The results were obtained by adding the checks that users performed on each criterion: 61 criteria and 132 users evaluating the OCW site registered 8,052 checks, corresponding to 100%, of which 40% were rated high, 49% medium and 11% low, as shown in Figure 6.

Figure 6: Percentage of errors using the accessibility scale


In order to obtain comparable parameters with those obtained from the automated assessment, we focused on the percentage obtained at the low level, since the problems considered at this level are those accessibility criteria that are not met or are absent. They are shown by level of accessibility in Figure 7. In addition, the manual evaluation allowed us to identify the accessibility problems that require a solution from the defined framework.


Figure 7: Problems detected in the manual assessment classified by accessibility level

Manual usability evaluation complements the automatic usability evaluation, which addressed effectiveness and efficiency, by applying the satisfaction evaluation of Sirius (Súarez et al., 2013) (see Section 3.2) through a questionnaire (see Table 9 in (Rodriguez et al., 2017)). The population of users was again the on-site Computer Systems and Computing students. Since Sirius is an evaluation framework based on the revision of heuristics by experts, the sample used was 21 students in the later levels of the degree, specifically those in the eighth (of ten) cycle. The 10 aspects and 83 criteria defined by Sirius were considered. The usability percentage of the site is obtained from the average of the results obtained in each questionnaire according to the procedure of Sirius. The usability percentage of the UTPL OCW site was 69.44%. In addition to processing each instrument, the percentage of each usability aspect defined by Sirius (see Table 6 in (Rodriguez et al., 2017)) was calculated. The obtained results are shown in Table 7 and Figure 8.

Table 7: PU of the UTPL OCW

Aspect | Percentage of usability (PU)
General Aspects (GA) | 64%
Identify and Information (II) | 82.56%
Structure and Navigation (SN) | 77.82%
Labelling (LB) | 74.71%
Layout of the Page (LY) | 79.93%
Comprehension and easy Interaction (CI) | 73.42%
Control and Feedback (CF) | 54.76%
Multimedia Elements (ME) | 71.94%
Search Elements (SE) | 78.58%
Help Elements (HE) | 36.72%
Average | 69.44%

Figure 8: Percentage of each usability aspect

5.7 Analysis of Results

The results obtained in the case study are analyzed and discussed on the basis of the research questions.

RQ1: Is the evaluation methodology of accessibility and usability able to quantify the degree of accessibility and usability of an OCW site?

The results of the case study show that the methodology allows one to obtain a measurable result of the accessibility and usability of an OCW site, and, in this particular case of the OCW of the UTPL, from different perspectives or points of view. Specifically, the result allows one to evaluate accessibility based on the levels of accessibility defined by the WAI and the principles of the WCAG 2.0 standard.

The accessibility levels A, AA, and AAA defined by the WAI are globally accepted. The results obtained for the OCW of the UTPL in the automatic and manual evaluations and their averages are outlined in Figure 9.

Figure 9: Percentage of errors by accessibility level


From the results obtained (see Figure 9), it is important to emphasize the following concerns:
• The average number of errors found at each level is quantified as N%; this means that the UTPL's OCW site complies with 100 - N% of the checkpoints with priorities 1, 2 and 3 (does not allow access, highly difficult to access, and slightly hampers access).
• The highest percentage of problems is located at the AAA level, the highest level of accessibility.
• The sum of the average error percentages across the three levels is 19%.
• The results obtained from the automatic and manual evaluations are comparable, which demonstrates consistency.

The accessibility measurements by the principles defined in the WCAG 2.0 standard are synthesized in Table 8 and Figure 10 below.

Table 8: Number and percentage of accessibility problems encountered by each principle of accessibility

  WCAG 2.0 principle | Number of associated criteria | Number of problems found | Percentage of problems found
  Perceivable | 22 | 7 | 31.81%
  Operable | 20 | 4 | 20.00%
  Understandable | 17 | 3 | 17.64%
  Robust | 2 | 1 | 50.00%
  Average | 61 | 15 | 24.59%

Figure 10: Number of associated criteria and number of errors found for each principle of the WCAG 2.0

From the data presented in Table 8, it is possible to conclude that:
• The highest percentage of errors is associated with the robust and perceivable principles, with 50% and 31.81% of their criteria presenting errors, respectively.
• The lowest percentage of errors is associated with the operable and understandable principles, with 20% and 17.64% of errors, respectively.
• With a total of 61 accessibility criteria evaluated, the 15 errors found represent 24.59% of accessibility problems. As a result, 75.41% is accessible, which corresponds to a MODERATE level of accessibility following the (Hilera et al., 2013) scale, i.e. an acceptable number of access limitations to the content or functionality of the OCW site.

The manual usability evaluation, which measures the satisfaction level using Sirius, provides a quantitative assessment in percentages of the usability aspects (see Table 7 and Figure 8). From these data, it is important to emphasize the following concerns:

• The average usability level is 69.44%. The (Hilera et al., 2013) scale has been adopted for measuring usability; thereby, this percentage places usability at the MODERATE level, i.e. an acceptable number of access limitations to the content or functionality of the OCW site.
• Since the usability percentage is 69.44%, it is possible to conclude that there is a level of 30.56% of usability problems.
• The aspects with the highest percentages of usability are identity and information (82.56%), layout of the page (79.93%) and search (78.58%).
• The aspects with the lowest percentages of usability are help (36.72%), control and feedback (54.76%) and general aspects (64%).
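The per-principle percentages in Table 8, the overall 24.59% of problems, and the resulting classification on the accessibility scale can be reproduced with a short script such as the sketch below. The thresholds used to map a percentage to the (Hilera et al., 2013) scale are illustrative assumptions, not the published cut-off values.

# Sketch reproducing the Table 8 percentages; the scale thresholds
# below are assumed for illustration only.
criteria = {"Perceivable": 22, "Operable": 20, "Understandable": 17, "Robust": 2}
problems = {"Perceivable": 7, "Operable": 4, "Understandable": 3, "Robust": 1}

for principle, total in criteria.items():
    pct = 100 * problems[principle] / total
    print(f"{principle}: {pct:.2f}% of criteria with problems")

total_criteria = sum(criteria.values())               # 61
total_problems = sum(problems.values())               # 15
problem_pct = 100 * total_problems / total_criteria   # 24.59%
accessible_pct = 100 - problem_pct                    # 75.41%

def level_on_scale(accessible):
    # Assumed illustrative bands; the original scale defines its own cut-offs.
    if accessible >= 90:
        return "HIGH"
    if accessible >= 60:
        return "MODERATE"
    return "LOW"

print(f"Accessible: {accessible_pct:.2f}% -> {level_on_scale(accessible_pct)}")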

RQ2: Is the evaluation methodology of accessibility and usability able to identify errors and improvements in the accessibility and usability of an OCW site? The methodology identifies specific errors to be dealt with in terms of accessibility and usability in both the automatic and the manual evaluation. The accessibility results obtained in the case study are shown in Table 9. In addition, Table 10 shows the usability errors identified in the case study of the UTPL OCW.

Table 9: Accessibility errors identified in the case study of the UTPL OCW

  Principle | Criterion | Level | Problem
  Perceivable | 1.1.1 Non-textual content: all non-textual content that is presented has a text alternative with an equivalent purpose | A | Images with no text alternatives; images with suspicious "alt" text; decorative images with a title attribute; consecutive text links
  Perceivable | 1.4.6 Contrast (enhanced): the visual presentation of text and images of text has a contrast ratio of 7:1, with the exception of large or incidental text and logos | AAA | Visual presentation
  Operable | 2.4.9 Link purpose (link only): there should be no links with the same text that link to different places | AAA | Links without content
  Understandable | 3.3.1 Error identification: if an input error is detected, the wrong item is automatically identified and described to the user by means of text | A | Required form controls
  Understandable | 3.3.6 Error prevention: where the user submits information, control options such as undo, verify or confirm must be provided | AAA | Identification of incorrect values in forms
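To make the first row of Table 9 concrete, the sketch below shows one way such problems (images with no text alternatives, or decorative images carrying a title attribute) could be detected automatically. It assumes the BeautifulSoup library is available and is only an illustration; it is not the evaluation tool used in the case study.

# Illustrative check for WCAG 2.0 criterion 1.1.1 (non-text content):
# flag <img> elements without an alt attribute. Not the study's tool.
from bs4 import BeautifulSoup

html = """
<img src="logo.png">
<img src="spacer.gif" alt="" title="spacer">
<img src="course.png" alt="Course cover">
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    if not img.has_attr("alt"):
        print(f"Missing alt attribute: {img.get('src')}")
    elif img["alt"] == "" and img.has_attr("title"):
        print(f"Decorative image with title attribute: {img.get('src')}")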

Table 10: Usability errors identified in the case study of the UTPL OCW

  Aspect | Criterion | Priority | Problem
  General Aspects | AG2: Provides content and services that are accurate and complete | HIGH | The services available are not easily identifiable on the OCW site
  Identity and Information | II7: Provides information about the author, sources, creation dates and revisions of the document in articles, news and reports | VERY HIGH | Information such as the author, date of creation and updating of the content is not placed on the OCW site
  Structure and Navigation | EN2: Proper organization and browsing structure | VERY HIGH | The structure of the content and the browsing indexes, in hierarchical form, is not suitable on the OCW site
  Comprehension and ease of Interaction | EF1: Clear and concise language is used | HIGH | The language used on the OCW site is too technical or complex, which makes it difficult to read
  Search | BU1: If it is necessary, it is accessible from all the pages of the site | VERY HIGH | The search option of the OCW site is not accessible from all pages of the site
  Search | BU5: Simple and understandable search mechanism | VERY HIGH | The search mechanisms of the OCW site are not simple and understandable for the user
  Help | AY2: Easy access and return to the help mechanism | HIGH | Absence of a help mechanism

RQ3: Is the framework of accessibility and usability of OCW sites able to provide fixes or improvements to problems or deficiencies identified by the methodology? The framework includes solutions for each accessibility and usability error, categorized by the attributes, aspects and criteria defined by the selected standards (see Tables 1 and 2 in (Rodriguez et al., 2017)).
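As a hypothetical illustration of how such a catalogue of solutions could be consulted programmatically, the sketch below maps the WCAG 2.0 criteria identified in Table 9 to remediation suggestions. The identifiers and suggestion texts are examples chosen for this sketch, not the framework's actual data structures or wording.

# Hypothetical lookup of framework solutions by WCAG 2.0 criterion;
# identifiers and suggestion texts are illustrative examples only.
solutions = {
    "1.1.1": "Add an alt attribute with an equivalent purpose to every image.",
    "1.4.6": "Raise the text/background contrast ratio to at least 7:1.",
    "2.4.9": "Give every link text that describes its target unambiguously.",
    "3.3.1": "Identify and describe input errors in text next to the field.",
    "3.3.6": "Offer undo, verify, or confirm options before submission.",
}

detected = ["1.1.1", "2.4.9", "3.3.6"]  # criteria flagged by the evaluation
for criterion in detected:
    print(criterion, "->", solutions.get(criterion, "No solution registered"))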

5.8. Discussion and Limitations

We obtained evidence of the viability of the methodology for evaluating the accessibility and usability of OCW sites and of the framework deployed with it. This work has quantified the accessibility and usability of the OCW website of the UTPL. It has also identified deficiencies and possible improvements of the OCW website in terms of accessibility and usability. Finally, a solution for each of the problems has been provided by the framework. As a result, the evaluation methodology and the framework enable the improvement of the accessibility and usability of the UTPL OCW website.


Case studies are qualitative in nature. For this reason, data collected from case studies are usually very difficult to judge objectively (Yin, 2008). To improve the internal validity of the results presented, the independent variables that could influence this case study have been identified as follows: the students' experience has a great influence, and this influence has been reduced by considering both students with and without OCW experience. However, the influence of the OCW size, complexity and accessible and usable elements cannot be reduced due to the inherent nature of case studies, which normally focus on one project. To improve the internal validity of the results, triangulation of data sources and methodology has been performed (see Section 5.3).


Construct validity is concerned with the procedure of collecting data and with obtaining the right measures for the concepts studied, which has been addressed by using standards. However, the major limitation in case study research concerns external validity, i.e., the generality of the results with respect to a specific population (Van Heesch et al., 2012), as only one case is studied. In return, case studies allow one to evaluate a phenomenon, model, or process in a real setting. This is important when striving to create solutions that can be applied to real problems in which a multitude of external factors may affect the results. Other techniques, such as validation through formal experiments, while permitting replication and generalization, cannot be considered here because they are conducted under controlled settings. Reliability is concerned with replication; in case studies, this means that the same results would be found if the analysis were redone. Therefore, this methodology and framework could be applied to other OCW sites by reproducing this case study.

6. Conclusions and future work


This paper takes a step toward promoting the use of OCW and the distribution of free knowledge at the university level by providing a framework to improve the accessibility and usability of OCW sites. This framework is presented as a solution to increase the level of access to and use of OCW. In addition, this work presents a methodology for the evaluation of OCW sites that allows one to determine the degree of usability and accessibility of a site as well as to identify shortcomings. This research has been defined from OER and web accessibility and usability related standards and the existing work found in the literature, and it defines a formal framework, taking a step forward in the area. Specifically, the defined framework to improve the accessibility and usability of OCW sites integrates the most relevant related works and standards: the Web Content Accessibility Guidelines WCAG 2.0 (W3C, 2008), the Usability Guide (ISO, 1998), and the usability aspects for educational sites set out by Sirius (Súarez et al., 2013). The defined methodology and framework have been deployed to assess the accessibility and usability of a case study, the UTPL OCW site. This allowed for the assessment of the quantitative level of accessibility and usability of the case study, as well as the identification of the most frequent errors and their possible solutions in both qualitative and quantitative terms. This allowed us to improve the accessibility and usability of the UTPL OCW site, which evidences that the methodology and the framework can help in the improvement of the accessibility and usability of OCW sites. The case study also demonstrated that the WAI guidelines proposed by the W3C (W3C, 2014) can be adapted to specific settings, such as OCW sites, with some adjustments or modifications. Finally, it is important to emphasize that the obtained results are important feedback to enhance and improve the proposed framework, so it will be incrementally improved and refined with successive evaluations of OCW sites.


It is therefore possible to conclude that this paper presents an integrated and complete framework for improving the accessibility and usability of OCW sites that can be applied immediately. This work opens up a wide variety of future research on (i) implementing and validating the framework with respect to the real increase in access to and use of OCW once the improvements suggested by the framework have been performed; (ii) applying the methodology of evaluation of usability and accessibility to other OCW sites in order to obtain more qualitative and quantitative inputs that support the framework; (iii) creating a freely distributed tool that helps users in the deployment of the methodology and framework; and (iv) including quality features in the framework to improve the quality of OCW sites; therefore, other features associated with the quality of software applications should be added, such as functionality, reliability, efficiency, maintainability and portability (ISO, 2000).


7. Bibliography

Agarwal, R., & Venkatesh, V. (2002). Assessing a firm's web presence: A heuristic evaluation procedure for the measurement of usability. Information Systems Research, 168-186.

Brajnik, G. (2008). A comparative test of web accessibility evaluation methods. In Assets '08: Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility (pp. 113-120). New York: ACM. doi:10.1145/1414471.1414494

Çakmak, T., Özel, N., & Yılmaz, M. (2013, July 4). Evaluation of the Open Course Ware initiatives within the scope of digital literacy skills: Turkish Open CourseWare Consortium case. Procedia - Social and Behavioral Sciences: 2nd World Conference on Educational Technology Researches - WCETR2012, 83, 65-70. doi:10.1016/j.sbspro.2013.06.014

Camilleri, A., Ehlers, U., & Pawlowski, J. (2014). State of the art review of quality issues related to open educational resources. Publications Office of the European Union, JRC Scientific and Policy Reports. Retrieved from http://is.jrc.ec.europa.eu/pages/EAP/documents/201405JRC88304.pdf

Caswell, T., Henson, S., Jensen, M., & Wiley, D. (2008). Open Educational Resources: Enabling universal education. International Review of Research in Open and Distance Learning, 9(1). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/469/1001

Coursera. (2016). Coursera. Retrieved from https://www.coursera.org/

Creative Commons. (2016). Creative Commons. Retrieved from https://creativecommons.org/

Dubois, J. (2012). Usabilidad en redes sociales. Master Thesis, Universidad Internacional de la Rioja, Madrid. Retrieved October 6, 2015, from http://reunir.unir.net/handle/123456789/51

Educause. (2016). Educause. Retrieved from https://library.educause.edu/

Educommons.com. (2016). Educommons. Retrieved from http://educommons.com/

Ferré, X. (2005). Integration framework for usability in the software development process. Universidad Politécnica de Madrid. Madrid: UPM. Retrieved from http://oa.upm.es/440/1/XAVIER_FERRE_GRAU.PDF

Gómez, S., Callaghan, L., Eick, S. C., Carson, S., & Andersson, H. (2012). An institutional approach to supporting open education: A case study of OpenCourseWare at Massachusetts Institute of Technology. In Proceedings of Cambridge 2012: Innovation and Impact - Openly Collaborating to Enhance Education. Cambridge: The Open University. Retrieved from https://goo.gl/UQ9Rth

Google Analytics. (2016). Google Analytics. Retrieved from https://www.google.com/intl/es/analytics/

Google PageSpeed. (2016). Google PageSpeed. Retrieved from https://developers.google.com/speed/pagespeed/

Hilera, J., Fernández, L., Suárez, E., & Vilar, E. (2013, January). Evaluation of accessibility of web pages of Spanish and foreign universities included in international university rankings. Spanish Journal of Scientific Documentation, 36(1), 1-16. doi:10.3989/redc.2013.1.913

Hodgins. (2002). The future of learning objects. The Instructional Use of Learning Objects.

IEEE. (2005). The Learning Object Metadata standard. Retrieved from Institute of Electrical and Electronics Engineers: http://ieeeltsc.org/wg12LOM/lomDescription

ISO 13407. (1999, June 10). ISO. Retrieved from ISO: http://www.iso.org/iso/catalogue_detail.htm?csnumber=21197

ISO. (2000). Information technology - Software product quality - Part 1: Quality model. Retrieved from International Standard Organization: http://www.cse.unsw.edu.au/~cs3710/PMmaterials/Resources/9126-1%20Standard.pdf

ISO 9241-11. (1998). ISO. Retrieved from ISO: http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=16883

ISO 9241-171. (2008). ISO. Retrieved from ISO: http://www.iso.org/iso/iso_catalogue/catalogue_ics/catalogue_detail_ics.htm?csnumber=39080

Jung, I., Sasaki, T., & Latchem, C. (2016). A framework for assessing fitness for purpose in open educational resources. Retrieved from https://www.surfspace.nl/media/bijlagen/artikel-697ee18ac0f1441bb158e6122818f5f589e.pdf

Martínez, S. (2014). OCW (OpenCourseWare) and MOOC (Open Course Where?). In OCWC Global Conference: Open Education for a Multicultural World (pp. 2-5). Ljubljana: OCWC. Retrieved from http://conference.oeconsortium.org/2014/wp-content/uploads/2014/02/Paper_16.pdf

MIT. (2016). MITOpenCourseWare. Retrieved from MITOpenCourseWare: http://ocw.mit.edu/index.htm

Miriadax. (2015). Miriadax.net. Retrieved from Miriadax.net: https://miriadax.net/cursos

Moen, W., Lusk, C., Muwanguzi, S., & Siewert, J. (2009). Open source software for use in learning object repositories: A review and assessment. Texas: Texas Center for Digital Knowledge. Retrieved from http://www.oncoreblueprint.org/_doc/LOR_OSS_Report_10Dec2009.pdf

OEC. (2016). Open Education Consortium. Retrieved from Open Education Consortium.

Open Education Consortium. (2015). Open Education Consortium. Retrieved from http://www.oeconsortium.org/

Open Education Consortium. (2016). Faculty: Creating OERs. Retrieved from Open Education Consortium: http://www.oeconsortium.org/info-center/topic/creating-oers/

Plone. (2016). Plone. Retrieved from http://plone.org/

Pollak, L. (2008, March). Should higher education course materials be free to all? Public Policy Research, 15(1), 36-41. doi:10.1111/j.1744-540X.2008.00506.x

Polsani, P. (2003). Use and abuse of reusable learning objects. Texas Digital Library, 3(4), 1-5. Retrieved from https://journals.tdl.org/jodi/index.php/jodi/article/view/89/88

Rhoads, R., Berdan, J., & Toven-Lindsey, B. (2013, February). The open courseware movement in higher education: Unmasking power and raising questions about the movement's democratic potential. Educational Theory, 63(1), 87-110. doi:10.1111/edth.12011

Rodríguez, G., & Cueva, S. (2010). OER, standards and trends. Universities and Knowledge Society Journal, 7(1), 1-8. Retrieved from http://rusc.uoc.edu/index.php/rusc/article/view/v7n1_cueva_rodriguez/v7n1_cueva_rodriguez

Rodriguez, G., Pérez, J., Cueva, S., & Torres, R. (2017). Accessibility and Usability OCW Data: The UTPL OCW.

Rojas, M., & Montilva, J. (2011). Software architecture to integrate learning objects based on web services. In Ninth LACCEI Latin American and Caribbean Conference (LACCEI'2011), Engineering for a Smart Planet, Innovation, Information Technology and Computational Tools for Sustainable Development (pp. 1-10). Medellín.

Romero, R. (2001). Usabilidad y accesibilidad, dos enfoques complementarios. Retrieved from Unidad de Investigación ACCESO: http://acceso.uv.es/accesibilidad/artics/01usabilidad-accesibilidad.htm

Runeson, P., & Höst, M. (2009, April). Guidelines for conducting and reporting case study research in software engineering. Empirical Software Engineering, 14(2), 131-164. doi:10.1007/s10664-008-9102-8

Runeson, P., Höst, M., Rainer, A., & Regnell, B. (2012). Case study research in software engineering: Guidelines and examples. Hoboken, New Jersey: John Wiley & Sons, Inc. doi:10.1002/9781118181034

Seffah, A., Donyaee, M., Kline, R., & Padda, H. (2006). Usability measurement and metrics: A consolidated model. Software Quality Journal, 159-178.

Stake, R. (1995). The art of case study research (pp. 49-68). Thousand Oaks: Sage Publications.

Súarez, M. C., Martínez, A. B., Alvarez, D., & Alva, M. E. (2013, March). Sirius: A heuristic-based framework for measuring web usability adapted to the type of website. Journal of Systems and Software, 86(3), 649-663. doi:10.1016/j.jss.2012.10.049

Terrell, R., & Caudill, J. (2012, January 1). OpenCourseWare: Open sharing of course content and design. Journal of Computing Sciences in Colleges, 27(3), 38-42.

Triola, M. (2012). Elementary statistics (12th ed.). United States of America: Pearson.

UNESCO. (2002). Forum on the impact of Open Courseware for higher education in developing countries: Final report. Paris: UNESCO. Retrieved from http://unesdoc.unesco.org/images/0012/001285/128515e.pdf

UNESCO. (2012). Paris OER Declaration. Retrieved from http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/WPFD2009/English_Declaration.html

Universia. (2016). OCW Universia. Retrieved from OCW Universia: http://ocw.universia.net/

UTPL. (2016). University Centres UTPL. Retrieved from University Centres UTPL: http://www.utpl.edu.ec/centros_utpl/

Van Heesch, U., Avgeriou, P., & Hilliard, R. (2012). A documentation framework for architecture decisions. Journal of Systems and Software, 85(4), 795-820. doi:10.1016/j.jss.2011.10.017

Varlamis, I., & Apostolakis, I. (2006). The present and future of standards for e-learning technologies. (A. Koohang, Ed.) Interdisciplinary Journal of Knowledge and Learning Objects, 2, 59-76. Retrieved from http://ijklo.org/Volume2/v2p059-076Varlamis.pdf

Vladoiu, M. (2011). Open courseware initiatives - after 10 years. In RoEduNet International Conference 10th Edition: Networking in Education and Research (pp. 1-6). Iasi: IEEE. doi:10.1109/RoEduNet.2011.5993712

W3C. (2002, December 17). User Agent Accessibility Guidelines 1.0. Retrieved from W3C Recommendation: https://www.w3.org/TR/UAAG10/

W3C. (2008). Web Content Accessibility Guidelines (WCAG) 2.0. Retrieved from W3C Recommendation: https://www.w3.org/TR/WCAG20/

W3C. (2013). Easy Checks - A First Review of Web Accessibility. Retrieved from Web Accessibility Initiative: https://www.w3.org/WAI/eval/preliminary.html

W3C. (2014a, July 10). Website Accessibility Conformance Evaluation Methodology (WCAG-EM) 1.0. Retrieved from W3C Recommendation: https://www.w3.org/TR/WCAG-EM/

W3C. (2014b, March 20). Accessible Rich Internet Applications (WAI-ARIA) 1.0. Retrieved from W3C Recommendation: https://www.w3.org/TR/wai-aria/

W3C. (2015a, September 24). Authoring Tool Accessibility Guidelines (ATAG) 2.0. Retrieved from W3C Recommendation: https://www.w3.org/TR/ATAG20/

W3C. (2015b). Web Accessibility Evaluation Tools List. Retrieved from https://www.w3.org/WAI/ER/tools/

W3C. (2016a). W3C Mission. Retrieved from W3C: https://www.w3.org/Consortium/mission

W3C. (2016b, October 07). Web Accessibility Initiative (WAI). Retrieved from https://www.w3.org/WAI/

William and Flora Hewlett Foundation. (2016). Open Educational Resources. Retrieved from http://www.hewlett.org/strategy/open-educational-resources/

Yin, R. (1994). Case study research: Design and methods (Applied Social Research Methods) (2nd ed.). Thousand Oaks, California: Sage Publications.

Yin, R. (2008). Case study research: Design and methods (4th ed.). Thousand Oaks, California: SAGE Publications.

Zope. (2016). Zope. Retrieved from http://www.zope.org/

Highlights

• Quantification of the accessibility and usability of websites that offer Open CourseWare.
• Integration of web accessibility and usability standards into the domain of Open Educational Resources such as Open CourseWare.
• Proposal and implementation of a methodology for assessing accessibility and usability in OER such as Open CourseWare sites.
• Proposal and implementation of a framework to increase the accessibility and usability of OER such as Open CourseWare sites.

• Secretaría de Educación Superior, Ciencia, Tecnología e Innovación del Ecuador
• Universidad Técnica Particular de Loja, Ecuador