Government Information Quarterly 31 (2014) 476–487
Revisiting Alabama state website accessibility

Norman E. Youngblood 1

School of Communication & Journalism, 217 Tichenor, Auburn University, AL 36849, USA
Available online 30 July 2014

Keywords: Accessibility; E-government; State websites; Section 508; WCAG 2.0
Abstract

Potter's (2002) accessibility review of over 60 Alabama state-level websites was designed to establish a baseline for monitoring the state government's progress on online accessibility. The study found significant room for improvement. Only 20% of the reviewed sites met Section 508 requirements, and only 19% of the sites met WAI Priority 1 accessibility standards, based on a combination of automated evaluation and manual inspection of the code. In 2006, Alabama adopted ITS 1210-00S2: Universal Accessibility, which offered basic guidelines to assist developers in complying with Section 508 requirements. The current study revisits the state home pages that Potter evaluated to see how accessibility levels have changed over the years, particularly with the state's adoption of ITS-530S2. Like Potter, the current analysis is based on a combination of automated testing and a manual review of each page's HTML. The study found that compliance has not improved substantially since Potter's analysis and reinforces the idea that the presence of a standard does not correlate with compliance.

© 2014 Elsevier Inc. All rights reserved.
1. Introduction

As the World Wide Web rose in importance as an e-government tool in the 1990s, the federal government took steps to ensure that citizens and government employees had access to electronic-based government information, regardless of disabilities. Section 508 of the Rehabilitation Act (29 U.S.C. § 794d) addresses a range of e-government accessibility issues, including providing specific guidelines for online information and applications (§ 1194.22). Although Section 508 guidelines were designed specifically for federal agencies, the guidelines have the potential to be applied to some state and local governments, depending on the provisions of federal funds that the entities might receive. With calls for accessible e-government on the rise, state governments soon began crafting their own regulations, often directly incorporating Section 508 guidelines (Jaeger, 2004). Not all states took this route, however. The State of Alabama's Information Technology Standard 530S2-00: Universal Accessibility (ITS-530S2) is designed to “advise agencies on the use of the minimum requirements for online accessibility for all State of Alabama web sites that comply with Section 508.” Rather than listing the full Section 508 provisions, ITS 530S2 provides six basic requirements for helping developers ensure compliance, including how to appropriately craft quality hyperlinks and image alternative attributes, and calling for developers to test sites on multiple browsers and to avoid using frames. The standard mandates compliance for most “Executive Branch agencies, boards and commissions,” particularly those using the alabama.gov and state.al.us domain names
1 E-mail address: [email protected]. Fax: +1 334 844 4573.
http://dx.doi.org/10.1016/j.giq.2014.02.007 0740-624X/© 2014 Elsevier Inc. All rights reserved.
(State of Alabama, 2011a,b). As this study found, however, adoption of accessibility guidelines does not necessarily equate to adherence.

Almost a decade has passed since Potter's (2002) accessibility review of over 60 Alabama state-level websites (although the study appeared in a 2002 issue, its data were collected in 2003). Designed to establish a baseline for monitoring the state government's progress on online accessibility, the study found that although accessibility seemed to have improved since West (2002), state-level websites had tremendous room for improvement. Only 20% of the reviewed sites met Section 508 guidelines, and only 19% of the sites met WAI Priority 1 accessibility standards. Potter's study appeared at a critical time in e-government. Federal web accessibility standards, defined by Section 508 of the Rehabilitation Act of 1973 as amended, had only been codified since 1998, and many states, Alabama included, had no state-level accessibility mandate, relying instead on individual departments to make policies (Potter, 2002). In 2006, Alabama adopted ITS 1210-00S2: Universal Accessibility, renamed ITS 530S2-00 in 2011. Two years after the standard was in place, West (2008) found that accessibility problems were still endemic in state agency websites. Although West ranked Alabama's state-level e-government services eighth nationally, only 10% of the Alabama state-level sites passed an automated accessibility test. The average among the states was 19%. The current study builds on West and Potter by revisiting the state homepages that Potter evaluated in 2003 to see how their accessibility levels have changed over the years, particularly with the state's adoption of ITS-530S2. Like Potter, the current analysis is based on a combination of automated testing and a manual review of each page's HTML, including checking for the use of appropriate image alternative attributes and the phrasing of linked text. Each
homepage was evaluated to see if it complied with three, often overlapping, standards: the World Wide Web Consortium's WCAG 2.0, the federal government's Section 508 guidelines, and Alabama's ITS-530S2.

2. Literature review

2.1. Web accessibility

For a website to be “accessible,” content needs to be available to users regardless of disability. The W3C calls for designers to take into account a range of potential disabilities when designing a website. These disabilities include visual and auditory impairment, mobility limitations, speech impairment, cognitive limitations, and learning disabilities (W3C, 2008). The W3C also argues that accessibility is a critical element of the World Wide Web and that the web “is fundamentally designed to work for all people, whatever their hardware, software, language, culture, location, or physical or mental ability” (W3C, 2010). Tim Berners-Lee, architect of the World Wide Web and current W3C director, underscores the importance of accessibility and the ability of the web to empower disabled users, arguing, “the power of the Web is in its universality. Access by everyone is an essential aspect” (W3C, 1997). The United Nations echoed this sentiment in The Convention on the Rights of Persons With Disabilities. Adopted in 2006, the convention specifically calls for signatory nations to “promote access for persons with disabilities to new information and communications technologies and systems, including the Internet” (United Nations, 2006). The treaty received wide support internationally, not only being the most quickly negotiated human rights treaty to date, but also garnering the most signatures on the first day that it was open for signature (United Nations, 2007).

2.2. Accessibility guidelines and legislation

Vanderheiden (1995) argued that the rise of graphics-based web browsers, such as Mosaic, raised issues for users with disabilities, particularly those with vision problems, and, along with others, offered recommendations to assist web developers in making content more accessible. Many of these early guidelines, however, focused on offering general advice, such as telling designers to make sure that they “use sufficient contrast” without defining how much contrast is sufficient (Vanderheiden, 2009). Researchers at the University of Wisconsin's Trace Research and Development Center eventually incorporated a number of these early guidelines into the Unified Web Site Accessibility Guidelines (Vanderheiden & Chisholm, 1998), which was in turn used as a starting point for the W3C's Web Content Accessibility Guidelines Working Group, responsible for developing the W3C's Web Content Accessibility Guidelines, WCAG 1.0 (W3C, 1999). The W3C (1999) divided checkpoints into three priority levels:

• Priority 1: A web content developer must satisfy this checkpoint. Otherwise, one or more groups will find it impossible to access information in the document. Satisfying this checkpoint is a basic requirement for some groups to be able to use web documents.
• Priority 2: A web content developer should satisfy this checkpoint. Otherwise, one or more groups will find it difficult to access information in the document. Satisfying this checkpoint will remove significant barriers to accessing web documents.
• Priority 3: A web content developer may address this checkpoint. Otherwise, one or more groups will find it somewhat difficult to access information in the document. Satisfying this checkpoint will improve access to web documents.
Meeting all Priority 1 checkpoints yields Level A conformance, meeting Priorities 1 and 2 gives a site Level AA conformance, and meeting all three priorities gives a site Level AAA conformance (W3C, 1999). WCAG 1.0 proved to have broad influence on e-government, with a number of countries, including the United States, using it as a basis for
their own accessibility guidelines (Donker-Kuijer, de Jong, & Lentz, 2010). The W3C introduced revised standards, WCAG 2.0, in 2008. While the new standards are largely backward compatible with the older standard, they attempt to move past specifying requirements for HTML to addressing accessibility issues across a wider range of web-related technologies. In creating the new standards, the W3C (2008) focused on website design meeting four basic accessibility principles:

• Principle 1: Perceivable—Information and user interface components must be presentable to users in ways that they can perceive.
• Principle 2: Operable—User interface components and navigation must be operable.
• Principle 3: Understandable—Information and the operation of user interface must be understandable.
• Principle 4: Robust—Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies.

In the process, the W3C moved towards creating more specific and testable requirements. As an example, WCAG 1.0, Guideline 1.1 specifies that designers need to “provide a text equivalent for every non-text element” and offers examples of non-text items, including images, animations, sounds, and videos. The corresponding WCAG 2.0 discussion covers three guidelines and provides a description of what appropriate text alternatives might be for each medium (W3C, 2009). The new guidelines measure conformance similarly to the original WCAG guidelines, though the W3C has replaced “priorities” with “levels” in the descriptions of the guidelines, with Priority 1 becoming “Level A Success Criteria,” and so on. Conformance has been slightly expanded and now includes either the webpage satisfying the Level A Success Criteria or providing a “conforming alternate version” (W3C, 2008).
Li, Yen, Lu, and Lin (2012) found that, assuming that a site already meets WCAG 1.0 guidelines, migrating existing e-government sites to conform to the new guidelines required only minor modifications to the design.

Section 508's web standards, § 1194.22, were signed into law in 1998, and the resulting regulations went into effect in 2001. The regulations are based in large part on WCAG 1.0 standards (Olalere & Lazar, 2011). In theory, the Office of the Attorney General is supposed to report on federal agency compliance with the regulations every two years; however, the Department of Justice did not collect that data between 2004 and 2010, and of the 100 U.S. federal websites Olalere and Lazar (2011) visited, 90% had Section 508 compliance issues. The most recent Department of Justice (2012) report on federal agency Section 508 compliance, covering FY2010, reinforces those findings, reporting, among other things, that only 67% of the agencies surveyed had an established process to ensure that those responsible for web content followed Section 508 guidelines, and that only 57.5% conducted regular website accessibility evaluation and remediation. While most agencies (82.4%) reported passing an audit of appropriate use of ALT attributes, video and multimedia seem to have received less attention, with 26.4% having a formal multimedia/video accessibility policy and 24.2% having no plans to develop such a policy (Department of Justice, 2012). The United States Access Board, a federal accessibility agency responsible for coordinating federal accessibility policy and representing the disabled public (U.S. Access Board, n.d.), is in the process of revising the Section 508 standards to match WCAG 2.0. A 2011 draft of the revised standards makes frequent reference to WCAG 2.0, including requiring most federal agency web-based communication to “conform to Level A and Level AA Success Criteria and Conformance Requirements specified for web pages in WCAG 2.0” (U.S. Access Board, 2011).
Based on the planned changes, using the newer standards in evaluating existing government websites should help set the stage for future studies. One of the possible reasons for non-compliance that Olalere and Lazar (2011) suggest is that neither the Department of Justice nor the Access Board had issued “clear guidelines … on what steps to take to make a website accessible.” Some state agencies, responsible for developing
online accessibility policies, including Alabama's, have attempted to address this issue by providing concise guidelines for meeting at least minimal accessibility standards, particularly as the federal government has put increased pressure on state and local governments to make their web presence accessible. Although the federal government bases its website accessibility on conformance to Section 508 guidelines, Jaeger (2004) points out that a number of other laws and regulations address e-government website accessibility. These include Section 504 of the Rehabilitation Act, the E-government Act, the Telecommunications Act of 1996, and the Americans with Disabilities Act (ADA). More recently, the federal government has moved to promote the captioning of online video with the 21st Century Communications and Video Accessibility Act (CVAA). The 2010 revision of Title II of the ADA, which covers the accessibility obligations of public entities, including state and local governments, explicitly incorporates internet-based products. Title II argues that, “Today, the Internet plays a critical role in daily life for personal, civic, commercial, and business purposes,” and that, “Although the language of the ADA does not explicitly mention the Internet, the Department has taken the position that Title II covers Internet Web site access” (Americans with Disabilities Act, Title II, pt.35 § 35.190, 2010). Title II applies to state and local governments regardless of their receipt of federal funds, and the DOJ is in the process of developing web accessibility guidelines. Current ADA rules allow public entities, such as state and local governments, to meet website accessibility requirements through non-internet-based services, such as staffed phone lines, but require that these services be equal to those found on the web, including the hours of operation (Americans with Disabilities Act, Title II, pt.35 § 35.190, 2010).
While all states have accessibility mandates (Georgia Tech Research Institute, 2009), these mandates vary by state, and stronger policies tend to lead to better accessibility, though stronger policies are not a guarantee of accessibility (Rubaii-Barrett & Wise, 2008). Many state mandates, including Alabama's, reference Section 508 and/or WCAG standards. Some states, such as Rhode Island (State of Rhode Island, 2013), Vermont (State of Vermont, 2013), and Alabama, provide basic checklists to help developers meet these guidelines. Alabama's accessibility standard, ITS 530S2-00, argues that it is important to “make government information accessible to all,” including those with “visual, physical, or developmental disabilities,” and that much like “environmental obstacles” in the physical world, the internet “can present obstacles” to access. In an effort to address this, ITS 530S2-00 provides a six-point guideline to help state agencies “on the use of the minimum requirements for online accessibility for all State of Alabama web sites that comply with Section 508” (State of Alabama, 2011a,b). These requirements cover most executive branch agencies; however, they exclude the judicial and legislative branches, as well as most educational institutions and the educational television commission (State of Alabama, n.d.a). ITS 530S2-00's requirements include the appropriate use of ALT attributes for images, the use of descriptive text for hyperlinks, offering alternatives to online forms, avoiding frames, and testing websites in multiple browsers. While by no means covering all of Section 508, these guidelines address at least some of the major concerns in making sure that a website is accessible and point developers to both Section 508 and WCAG 2.0 resources, including DOJ accessibility enforcement information (State of Alabama, 2011a,b).
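Several of these checklist items lend themselves to mechanical checking. As a rough illustration (the markup and checker below are hypothetical examples, not taken from the standard or from the study's instruments), a short script using Python's standard-library HTML parser can flag images that lack an ALT attribute entirely:

```python
# Illustrative sketch: flagging <img> elements with no alt attribute at all,
# the most basic form of the ALT-attribute rule. Markup strings are invented.
from html.parser import HTMLParser

GOOD = '<img src="seal.png" alt="Great Seal of Alabama">'  # descriptive alt
DECORATIVE = '<img src="bullet.gif" alt="">'               # empty alt for decoration
BAD = '<img src="header.jpg">'                             # no alt attribute

class AltChecker(HTMLParser):
    """Collects the src of every <img> that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing_alt.append(attrs.get("src", "?"))

def images_missing_alt(html):
    checker = AltChecker()
    checker.feed(html)
    return checker.missing_alt

print(images_missing_alt(GOOD + DECORATIVE + BAD))  # ['header.jpg']
```

Note that the decorative image with an empty alt value passes this check by design; whether a non-empty alt value is actually informative is a separate question that, as discussed below, automated tools handle poorly.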
Findings by Yu and Parmanto (2011), however, suggest that requiring strict adherence to Section 508 may be a more effective means of increasing compliance than more limited state-specified guidelines.

2.3. E-government in Alabama

In 2001, the state of Alabama entered into a contract with Alabama Interactive, a subsidiary of NIC, to develop a state web portal, Alabama.gov, and to assist with e-government services such as financial transactions, systems integration, security, and web hosting (State of
Alabama, eGovernment Initiative). The following year, the state set down a framework for approaching e-government, noting that it “sees e-government as no longer just a good idea, but a necessity” (State of Alabama, 2002a, p. 1). The framework called for the adoption of site development standards, particularly in the area of information architecture, that would facilitate the exchange of information between state agencies and provide a “consistent approach” for users conducting business with the state (State of Alabama, 2002a, p. 2). As part of this framework, the state also recognized that e-government services, done correctly, offer a number of advantages for the state, including increasing government efficiency, improving customer service, and improving the state's overall image. Alabama Interactive's contract with the state included working in many of these areas. A review of the company's Network Manager Reports to the state from 2002 to 2013 matches these goals, as the group's main focus outside of the Alabama.gov website has been helping state agencies with web-based applications, such as e-commerce and online applications for professional licensing, rather than developing agency websites (e.g., Alabama Interactive, 2002, 2005, 2009, 2013).

Launched in 2002, Alabama.gov (Fig. 1) initially relied on mirrored text-only versions of the site to ensure site accessibility. Early iterations included a page listing the accessibility status of the text version of the site on a section-by-section basis. The report (retrieved from the Internet Archive) from October 14, 2002 (see Fig. 2) indicated that the text-only version of the site had passed automated Section 508 accessibility review using Bobby and that all but one section (which was under review) met WCAG 1.0 standards (State of Alabama, 2002b). Alabama.gov continued to maintain a text-only version of the site until mid-2011 (State of Alabama, 2011a, 2011b).
Alabama e-government in general received low marks through most of West's 2001–2008 studies on the quality of e-government in the United States at the state and federal levels, which included both state portal websites such as Alabama.gov and individual state agency websites. As shown in Table 1, West's studies on federal and state e-government (2001, 2002, 2003, 2004, 2005, 2006, 2007) repeatedly found Alabama to be almost at the bottom of state-level e-government rankings, including several years at next-to-last. In many cases, Alabama state websites did not follow best practices, such as offering site translation services and privacy and security statements. Accessibility was particularly problematic for Alabama sites, with only 6% passing an automated accessibility test in 2001 (West, 2001). With one exception (2006), the number of Alabama sites passing the automated accessibility test did not break 30% between 2001 and 2007 (West, 2002; West, 2003a, 2003b; West, 2004; West, 2005; West, 2006; West, 2007). Between 2007 and 2008, however, Alabama e-government in general improved tremendously. In 2008, the final year of West's studies, Alabama ranked eighth nationally. At the same time, however, only 10% of the state's websites passed an automated accessibility test that year, compared to a national average of 19% for state websites. When taking a longitudinal view of West's accessibility figures, it is important to note that in 2008, West switched accessibility tools. Watchfire's Bobby, used in previous studies, was no longer available, and West replaced it with WebAIM's WAVE. A review of the 2007 and 2008 data reveals that the percentage of state and federal websites passing the automated test dropped by over 50% between 2007 and 2008 (West, 2007; West, 2008), suggesting that differences between Bobby and WAVE, rather than a change in coding practices, may account for the dropping pass rate.
The most recent national study (Yu & Parmanto, 2011) included an automated analysis of 721 pages from the alabama.gov domain, comprised of pages from the state web portal (www.alabama.gov) as well as state agencies using the alabama.gov domain such as ema.alabama.gov and medicaid.alabama.gov. The study found a per-page average of 1.98 Priority 1 errors, 1.2 Priority 2 errors, and 3.39 Priority 3 errors based on WCAG 1.0 standards, placing Alabama 45th nationally in terms of accessibility (Yu & Parmanto, 2011).
Fig. 1. Alabama.gov, circa October 2002.
2.4. Disabilities in the United States and Alabama

Accessibility is an issue for a relatively large portion of the population in both the United States as a whole and for Alabama specifically. Based on the 2008 American Community Survey data (U.S. Census Bureau, n.d.), at the national level, around 19% of noninstitutionalized Americans have some sort of disability. Americans over 65 are among the most likely to have a disability, with 38% of members of that age group reporting a disability. State-level statistics in Alabama are relatively similar, around 16% overall, with 44% of the over 65 age group reporting one or more disabilities. Vision problems affect over 3% of the Alabama population, and as one might expect, that number rises to 9% among those 65 and older, a group that is also more likely to have cognitive disorders (13%) than the general population (6%). It is worth noting that in many cases, groups that fall into the have-not portion of the digital divide, such as the elderly and some minority groups, often have higher disability rates than the overall population, meaning that accessibility problems may compound existing digital divide issues.
2.5. E-government accessibility research

While establishing accessibility regulations and guidelines provides metrics for helping assess accessibility, these actions by no means guarantee that government websites will be accessible. As noted above, Olalere and Lazar (2011) found that 90% of the United States federal agency websites that they examined failed to meet Section 508 requirements and that “laws that require accessibility and regulations that explain what interface features must be present” have not resulted in increased compliance. These findings echo earlier studies of federal websites by West (2002, 2006, 2008) and others (e.g., Jaeger, 2006; Loiacono, McCoy, & Chin, 2005). Researchers have also found extensive accessibility issues in state-level websites in both comparative studies (e.g., Fagan & Fagan, 2004; Goette, Collier, & White, 2006; West, 2002, 2006, 2008; Yu & Parmanto, 2011) and in studies of individual states (e.g., Lazar et al., 2010), including Potter's (2002) study of Alabama state-level websites. State-level website accessibility has fared better than did federal-level sites in some cases (Yu & Parmanto, 2011). Finding ways to improve state-level accessibility has been complicated. Drawing on West (2004), Rubaii-Barrett and Wise (2008) found few easily discernable factors that correlated to accessibility, dismissing several likely elements, including population size, per-capita income, and percentage of the population with a disability. The only factor that seemed to make a difference was the presence of a strongly worded accessibility mandate with clear guidelines. Even this, they point out, did not guarantee accessibility, in part because the mandates usually lacked an enforcement element. Lazar et al. (2013) point out that there is a tremendous need for longitudinal studies of state-level website accessibility. In their longitudinal study of Maryland state-level sites, they found that agencies that adopted a state template that had been vetted for accessibility typically saw an improvement in accessibility, and often a tremendous one. As discussed above, while Alabama e-government in general has improved dramatically since 2007 (West, 2008), accessibility remains an issue, suggesting that Alabama website accessibility may not have improved significantly since Potter's (2002) findings despite the publishing of state-level guidelines to promote website accessibility. While there are clearly shortcomings in e-government website accessibility, several researchers (e.g., Hackett, Parmanto, & Zeng, 2004; Hanson & Richards, 2013) have found that e-government sites perform better than many commercial sites and that, in general, e-government accessibility has improved, despite the growing complexity of the sites.

2.6. Accessibility methodology

Researchers have typically approached evaluating website accessibility by using one or more of the following strategies: automated testing, heuristic evaluation, expert evaluation, user testing, policy analysis, or web-manager questionnaires. Each method offers insights into different aspects of website accessibility, ranging from testing the level of accessibility, to legal implications, to studies such as those by Lazar, Dudley-Sponaugle, and Greenidge (2004) and Farrelly (2011), which examine why designers choose to include or exclude accessibility from their designs. Jaeger (2006), in particular, argues strongly for evaluation using multiple approaches. Many early e-government accessibility studies relied strictly on automated testing, which involves submitting the website to an online or locally-based testing tool that then applies accessibility heuristics, such as WCAG 1.0 and 2.0 and Section 508, to the HTML code to test for common accessibility problems. These problems can include missing ALT attributes, poor color combinations, poorly labeled or unlabeled form elements, and incorrect tab order. Automated testing is also common in e-government studies in
which accessibility is only a portion of what is being studied, such as studies examining levels of e-government, website features, and overall website usability (West, 2002, 2006, 2008; Youngblood & Mackiewicz, 2012). Many early studies used Watchfire's Bobby (e.g., Fagan & Fagan, 2004; Potter, 2002; West, 2002, 2003a,b, 2006); once it became unavailable, researchers turned to other options, particularly WebAIM's WAVE (West, 2008; Youngblood & Mackiewicz, 2012). While Bobby and WAVE are based largely on WCAG 1.0 standards, Bobby was particularly useful as it generated reports outlining WCAG 1.0 and Section 508-related issues, something that WAVE does not do.

Fig. 2. Alabama.gov accessibility information, circa October 2002.

Table 1
Rankings of Alabama e-government and accessibility scores from West (2001–2008).

Year | e-gov score (1–100) | e-government rank | % AL sites passing Bobby/WAVE | % of sites passing Bobby/WAVE for top state
2001 | 33   | 48 (tied) | 6   | 60
2002 | 35.8 | 48        | 5   | 92
2003 | 31.9 | 46        | 16  | 86
2004 | 29.9 | 44        | 25  | 91
2005 | 31.9 | 48        | 29  | 97
2006 | 28.4 | 49        | 35  | 97
2007 | 37.2 | 45        | 21  | 100
2008 | 66.4 | 8         | 35* | 64*

* Accessibility tool changed from Bobby to WAVE in 2008.

There are a number of alternatives to WAVE, but few offer evaluations based on WCAG 2.0 standards. AChecker (ATRC, 2009) is one of
the exceptions. An open source tool funded in part by the Government of Ontario, AChecker can validate code against a number of standards, including WCAG 1.0 and 2.0 and Section 508, and points the user to particular sections of code that need to be manually evaluated. In addition, AChecker offers the option of validating HTML and CSS (ATRC, n.d.; Gay & Li, 2010). While a relative newcomer to accessibility evaluation, AChecker has seen use by both e-government (Fuglerud & Røssvoll, 2012) and e-commerce (Gilbertson & Machin, 2012) researchers.

As several researchers (e.g., Jaeger, 2006; Olalere & Lazar, 2011; Harper & Chen, 2012) point out, though, automated testing comes with some decided limitations. As an example, automated testing can tell that a designer has included the ALT attribute within an image element; however, it cannot tell whether the ALT attribute information is actually useful, and it will treat “image 29” as a valid ALT attribute. Automated software also cannot advise on compliance for some standards, such as flicker rate and the use of captions in audio and video. More troubling, as Harper and Chen (2012) point out, two automated tools may not return the same number of errors for an individual page. To address these problems, several researchers (e.g., Potter, 2002; Olalere & Lazar, 2011; Fuglerud & Røssvoll, 2012) argue for including expert evaluation as part of the process, combining automated testing with an evaluation in which a knowledgeable coder examines the HTML code for problems, particularly the appropriate use of ALT attributes. Jaeger (2006) argues for including disabled users in the testing process, as they are more likely to be able to find problems, similar to standard user-based usability testing. While this technique offers advantages, locating appropriate users can be complicated, and the technique is time-consuming, particularly for large samples.

3. Methodology

3.1.
Website sample

Because the goal of this study was to determine how well Alabama state-level websites had responded to the state's accessibility mandate, the website selection was based upon the sites that Potter (2002) evaluated nine years earlier (his data was collected in 2003). Using the list in Potter, Appendix A, the researcher verified the website addresses and updated them as needed, using a combination of the state web portal and Google. Usable addresses were available for all but three sites from Potter's study. The exceptions were the Garrett Coliseum, the Alabama State Capitol Police, and Alabama's Aerospace Attractions. The first two of these appear to have been rolled into other websites. Data collection occurred over a four-day period in May 2012.

3.2. Evaluation methodology

One of the challenges in this study is that Potter (2002) and many other early scholars looking at website accessibility relied on the website analysis tool Bobby, which is no longer readily available to researchers. That said, the core data in Potter (2002) consisted of testing for WCAG 1.0 and Section 508 compliance, a task that can be completed using other tools such as AChecker. For this study, the researcher used AChecker to evaluate the sites based on the most recent W3C guidelines, WCAG 2.0, and on Section 508 § 1194.22 guidelines. AChecker returns three levels of problems in its results: known problems, likely problems, and potential problems. For this study, the researcher only recorded issues listed as known problems, which AChecker describes as “problems that have been identified with certainty as accessibility barriers.” For WCAG 2.0, the researcher recorded the number of known problems identified for A, AA, and AAA level guidelines. In the case of Section 508, the researcher recorded the number of known problems identified for standards A–K from § 1194.22.
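The recording step described above can be sketched in a few lines. The record format below is invented for illustration (AChecker itself is a web-based tool, and the study's actual coding sheets are not reproduced here); the point is simply the filtering of "known" problems and the per-level tally:

```python
# Hypothetical sketch of the recording step: keep only "known" problems
# from a checker's results and count them per guideline level.
from collections import Counter

# Invented issue records; a real AChecker report would be the source.
results = [
    {"certainty": "known",     "level": "A"},
    {"certainty": "known",     "level": "A"},
    {"certainty": "likely",    "level": "AA"},   # excluded: not a known problem
    {"certainty": "known",     "level": "AAA"},
    {"certainty": "potential", "level": "A"},    # excluded
]

known_by_level = Counter(
    r["level"] for r in results if r["certainty"] == "known"
)
print(dict(known_by_level))  # {'A': 2, 'AAA': 1}
```

The same tally, keyed on standards A–K rather than levels, would cover the Section 508 side of the coding.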
As noted in the literature review, automated tools, such as AChecker, are a good starting point for evaluation but are far from foolproof. Therefore, the researcher also conducted a visual inspection of each page. Like
Potter (2002), the researcher examined the HTML for each page to see if it included usable ALT attributes or if a page provided nebulous ALT text. For example, did the header image for the Alabama Department of Veterans Affairs' website use ALT text that made sense, such as "Department of Veterans Affairs," or something that provides no useful information, such as "image98.jpg" or "header image"? As in Potter (2002), the code for each page was also examined to see if images and image maps included usable ALT attributes all of the time, some of the time, or none of the time. The researcher also relied on visual inspection to check whether the homepage met the following requirements, quoted from Alabama Standard 530S2-00:

• Every graphic image shall have an "alt" tag and a short description that is intuitive to the user. If a graphic image is used as a navigation element, it shall contain a text description and direction that is intuitive to the user. Decorative graphics, such as bullets, shall be set with an "alt" tag of <empty> as to not impede screen readers.
• For every graphic element that uses an image map, alternative text of the hyperlink shall be provided.
• The site shall have descriptive, intuitive text links and avoid the use of vague references such as "click," "here," "link," or "this."
• The use of frames shall be avoided since they cannot be read intelligently by screen readers, create navigation problems, and are not supported by all browsers.
• An alternative form of access shall be made available for online forms, such as an e-mail address or phone number.

In each case, the page was coded as having passed or failed the requirement. One Alabama Standard 530S2-00 requirement was not tested in this study, as it was outside of the study's focus on accessibility:

• Multiple browser testing shall be conducted on current versions of Firefox, Netscape Navigator, Internet Explorer, Safari and Lynx.
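Two of the requirements above, descriptive link text and the avoidance of frames, lend themselves to simple pre-screening heuristics. The sketch below is hypothetical (it is not the coding instrument used in this study, and the function and variable names are invented); it only narrows down candidates that a human coder must still confirm:

```python
import re

# Vague link phrasings that ITS-530S2 tells developers to avoid
VAGUE_LINK_TEXT = {"click", "click here", "here", "link", "this"}

def audit_links_and_frames(html):
    """Rough screen for two ITS-530S2 requirements: descriptive link
    text and avoidance of frames. Manual review is still required."""
    findings = []
    # Flag anchors whose visible text, stripped of tags, is a vague phrase
    for match in re.finditer(r"<a\b[^>]*>(.*?)</a>", html, re.I | re.S):
        text = re.sub(r"<[^>]+>", "", match.group(1)).strip().lower()
        if text in VAGUE_LINK_TEXT:
            findings.append(f"vague link text: {text!r}")
    # Any <frame> or <frameset> element fails the standard outright
    if re.search(r"<frame(set)?\b", html, re.I):
        findings.append("page uses frames")
    return findings

sample = '<a href="burn.html">click here</a><frameset cols="20%,80%"></frameset>'
print(audit_links_and_frames(sample))
# ["vague link text: 'click here'", 'page uses frames']
```

Requirements such as whether a description is "intuitive to the user" resist this kind of automation entirely, which is why the pass/fail coding described above was done by hand.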
The researcher conducted the visual inspection by viewing each page in Firefox 12.0 on an Apple computer running OS X 10.7.4 and visually inspected the resulting HTML code using Firefox's "View Source" option. Each state agency home page was evaluated to see if the page met Alabama ITS-530S2. This included checking for the use of frames, usable image ALT attributes, usable ALT attributes for image map information, and usable linked text. The researcher also examined each page to see if there was a form on the page, as well as whether there was contact information or a link to contact information on the main page. As the focus of this study was on accessibility, the researcher did not test each page in multiple browsers. The researcher coded the data from the WCAG 2.0, Section 508, and ITS-530S2 evaluations into Excel for analysis and comparison with Potter's results. Intercoder reliability, based on a 10% sample, yielded a Cohen's Kappa of .96.

4. Results

Appendix A outlines the changes in website accessibility between Potter's (2002) findings and the current study's findings in 2012 on a site-by-site basis. Table 2 provides a comparison of overall site conformance changes between 2003 and 2012.

Table 2
Website accessibility 2003 v. 2012. Sites conforming to standard in 2012 (2003 in parentheses).

Level    Number      Percent
A        5 (12)      8 (19)
AA       16 (1)      27 (2)
AAA      36 (0)      60 (0)
508      13 (10)     22 (16)
AITS     4 (na)      7 (na)

Notes: 2012 WCAG 2.0 and 508 data based on AChecker (n = 60); 2003 WCAG 1.0 and 508 data based on Bobby (n = 64) (Potter, 2002). Percentages are rounded. AITS data based on a manual evaluation of the code.

These results come with two
important caveats. First, as Bobby is no longer readily available, AChecker was used in this study, and prior studies suggest that accessibility results often vary between tools (Harper & Chen, 2012); the changes in the accessibility pass rates between West (2007) and West (2008) suggest that newer tools may be more sensitive to accessibility issues. Second, part of Potter's evaluation relied on WCAG 1.0 standards, while the current study relies on the newer WCAG 2.0 standard. Section 508 guidelines, however, remain unchanged. Overall, adherence to Section 508 standards has improved slightly, with AChecker returning 13 sites with no Section 508 standard violations, versus Bobby's nine sites in 2003. Comparing 2012 sites with no WCAG 2.0 A-level violations against 2003 sites with no WCAG 1.0 Priority 1 violations, current home pages performed worse under the new standards: only five sites had no WCAG 2.0 A-level violations in 2012, versus 11 sites with no WCAG 1.0 Priority 1 violations in 2003. Table 3 compares the most common WCAG 1.0 violations from 2003 with the most common WCAG 2.0 violations from 2012. More troubling, only four sites (the Office of the Attorney General, the Alabama Crime Victims Compensation Commission, the Governor's Office on National and Community Service, and the Alabama Electrical Contractors Board) passed a manual evaluation of the first four requirements of Alabama ITS-530S2, and only one site, the Office of the Attorney General, passed both the automated evaluation of WCAG 2.0 and Section 508 and the manual evaluation of ITS-530S2 standards. Even the state's primary website, Alabama.gov, had some instances of missing ALT attributes. Some sites had numerous issues, despite being visually appealing.
As an example, AChecker reported the Alabama Forestry Commission website as violating three Section 508 standards and seven Level A WCAG 2.0 guidelines, including missing ALT attributes for some 40 images. While some of these images were transparent spacer images, some of the images served as hyperlinks. Fig. 3 illustrates some of these problems, showing the Alabama Forestry Commission website with a sample of the results returned by AChecker.

4.1. Section 508 standards

The most commonly violated Section 508 standard, based on the number of sites with one or more violations, was Standard A, which mandates a text equivalent for each non-text element. In all, AChecker identified 41 sites with alternative text problems, totaling 372 instances, with an average of 9.07 violations per identified page. AChecker identified violations of Standard L—"When pages utilize scripting languages to display content, or to create interface elements, the information provided by the script shall be identified with functional text that can be read by assistive technology"—as occurring in 32 sites, with an average of 3.49 violations per identified site (111 violations in all). The third most common identified error was for Standard N—"When electronic forms are designed to be completed on-line, the form shall allow people using assistive technology to access the information, field elements, and functionality required for completion and submission of the form, including all directions and cues"—occurring in 45 sites, with an error average of 2.2 per site.

4.2. WCAG 2.0 Guidelines

The most common WCAG 2.0 Level A conformance error that AChecker identified was of Guideline 3.1.1: "The default human language of each Web page can be programmatically determined." This guideline mandates that the language (English, Spanish, Chinese, etc.) be identified in the HTML, usually in the HTML element, such as <html lang="en">, identifying that the page is in English. Of the 60 sites examined, 85% violated this guideline. The second most common error was a violation of Guideline 1.1.1: "All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for the situations listed below." The numbers here match the corresponding Section 508 Standard A results: 41 sites, or 63.33%. The third and fourth most common errors were violations of Guideline 1.3.1, "Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text" (26.67% of sites), and of Guideline 3.3.2, "Labels or instructions are provided when content requires user input" (25%). The AChecker-identified Level AA conformance errors were primarily problems with resizable text, though a few sites had problems with headings and labels, as well as text color to background color contrast ratios.
81.67% of sites violated Guideline 1.4.4: "Except for captions and images of text, text can be resized without assistive technology up to 200% without loss of content or functionality." The next two most common WCAG 2.0 AA issues were Guideline 2.4.6—"Headings and labels describe topic or purpose"—(10% of sites) and Guideline 1.4.3—"The visual presentation of text and images of text has a contrast ratio of at least 4.5:1, except for the following"—(5% of sites). AChecker identified only two AAA conformance issues: Guideline 1.4.6—"The visual presentation of text and images of text has a contrast ratio of at least 7:1, except for the following"—(38.33%) and Guideline 1.2.9—"An alternative for time-based media that presents equivalent information for live audio-only content is provided"—which AChecker identified only one site as violating (1.67%).

Table 3
Total sites for Level A errors by type, 2003 v. 2012.

2003 Level A errors (WCAG 1.0; Bobby, n = 64; Potter, 2002)                              Sites
Provide alternative text for all images                                                  55
Give each frame a title                                                                  4
Provide alternative text for each applet                                                 6
Provide alternative text for all image map hot-spots                                     9
Provide alternative text for all image-type buttons in forms                             1

2012 Level A errors (WCAG 2.0; AChecker, n = 60)                                         Sites
3.1.1 Language of page (a Level AAA error under WCAG 1.0)                                51
1.1.1 Non-text content (alt text)                                                        40
1.3.1 Info and relationships (sequencing)                                                16
3.3.2 Labels or instructions: Labels or instructions are provided when content requires user input.    15
4.1.1 Parsing: Proper nesting of HTML elements and use of unique CSS IDs.                12
2.1.1 Keyboard: Site operation does not require a mouse.                                 11
1.4.1 Use of color: Color is not the only means of conveying information.                10
2.4.2 Page titled: Web page has a descriptive title.                                     6
2.4.4 Link purpose (in context): The purpose of links can be determined by the link text.    6
2.2.1 Timing adjustable: No timed functions or opportunity to adjust timing.             1
2.2.2 Pause, stop, hide: Gives the user the ability to control automated content.        1
2.4.1 Bypass blocks: The ability to bypass blocks of content repeated on multiple pages to get to the primary page content.    1

Fig. 3. Sample of AChecker results based on the Alabama Forestry Commission website.

4.3. AITS 530S2-00 standards

Given the high percentage of sites that AChecker identified as having ALT attribute issues, it is not surprising that the vast majority of sites failed to meet the ITS-530S2 standard mandating the appropriate use of image ALT attributes. Only six sites passed a manual inspection of image ALT attributes, and 90% of the sites had one or more problems with image ALT attributes. Using descriptive text in text links also posed a problem for a number of sites, with 23.33% of sites using unclear phrasing such as "click here" as linked text. The use of ALT attributes for image maps fared substantially better, however, with only one site, the Alabama Law Institute, missing ALT attributes for an image map. All but one site, the Alabama Legislative Information System, followed the mandate to avoid frames. State sites also performed well on the standard of providing alternate contact information when forms are present. Forty sites had a form of some sort on the main page, typically a search box or a sign-up form for email lists. Half (20) of these pages had an email address or phone number on the main page, while the other half (20) had a link to contact information (such as "contact us" or "directory") on the main page. Of the 60 sites examined, 58 had contact information available, with 27 sites having contact information on the main page and 31 with a link to contact information. Only two sites did not have contact information or a link to contact information directly on the main page.

5. Discussion

Despite the adoption of ITS-530S2 since Potter's (2002) study, the accessibility of Alabama state-level websites has seen only limited improvement, particularly in terms of the number of sites passing Section 508 compliance. Of the websites that Potter (2002) analyzed that were revisited for this study, Bobby identified 80% as not passing Section 508 standards. Almost ten years later, using AChecker, 78.33% of Alabama state-level sites still failed to pass an automated review of
Section 508 standards. These sites also appear even less ready to meet the likely change in these standards as 92% failed to pass an automated review of WCAG 2.0 guidelines, compared to an 81% failure rate of the WCAG 1.0 Priority 1 guidelines found by Potter (2002). Particularly problematic sites in terms of the number of standard/guidelines violated include The Department of Environmental Management, which AChecker identified as violating four WCAG 2.0 A-level guidelines and four Section 508 standards; the Alabama Guard (five A-level guidelines and three Section 508 standards) and the Forestry Commission (seven A-level guidelines and three Section 508 standards). Perhaps more troubling, however, were some of the more egregious failures to address the most basic standards, such as the use of ALT attributes. As examples, Alabama's Department of Homeland Security had over 80 images missing ALT attributes, the Forestry Board was missing ALT attributes for 40 images, and the Onsite Wastewater Board for almost 30 images. A manual review of the code revealed that in 13 homepages, the designers either omitted ALT attributes entirely or used them incorrectly, in some cases using ALT=“” for linked images or using filenames for ALT information. These sites included the Peace Officer Annuity Fund, the Department of Veterans Affairs, the Board of Examiners in Marriage and Family Therapy, the Children's Trust Fund of Alabama, The Alabama Law Institute, and Office of the State Auditor. While in some cases, the missing ALT attributes may not keep the site from being usable, such as the missing ALT attribute for a spacer image, in other instances, the missing information was for images that contained text and linked images. In the example of the Alabama Forestry Commission shown in Fig. 3, the graphic links for obtaining a burn
permit, the American Recovery and Reinvestment Act, Facebook, YouTube, and the link to ask questions are all missing ALT attributes. The links are not readily available elsewhere on the page, though there are text-based links at the top of the page to FAQs and to contacting the agency. There are a number of other accessibility problems that may not be readily apparent on the page. The picture of the grass underneath "Welcome" is part of a Flash-based navigation system, and Flash can be problematic for blind users. In addition, the page is based in part on nested tables, a design technique known to cause significant problems for screen readers. The end result is that, while a blind user might eventually find the information he or she is looking for, the process may not be that simple. Many of the accessibility problems identified in this study can be fixed relatively easily and do not require redesigns of the site, for example adding appropriate ALT attributes for images and adding a language attribute to the HTML element in a document. It would, of course, have been easier for designers to include this information during the initial design process rather than having to address it after the fact. Many standard web development tools, including Adobe's Dreamweaver and Microsoft's Expression Web, ask developers to include ALT information during the development process, specifically to help designers make sure that they provide this information, though the tools are less likely to prompt designers to include language information. Some tools also have built-in accessibility analysis tools. Even without access to those tools, designers have free access to a wide range of automated analysis tools, such as AChecker and WebAIM's WAVE, all of which will at least catch some of the more egregious errors, such as missing ALT attributes.
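Both of the easy fixes just described amount to a single attribute each. The following hypothetical sketch (the function name and markup are invented for illustration; real tools such as AChecker and WAVE are far more thorough) shows a before-and-after page and the kind of minimal check that would catch these two issues:

```python
import re

def quick_audit(html):
    """Reports the two easily fixed issues discussed above: a missing
    lang attribute on the <html> element and images with no ALT attribute."""
    issues = []
    html_tag = re.search(r"<html\b[^>]*>", html, re.I)
    if not html_tag or "lang=" not in html_tag.group(0).lower():
        issues.append("missing lang attribute on <html> element (WCAG 3.1.1)")
    for img in re.finditer(r"<img\b[^>]*>", html, re.I):
        if "alt=" not in img.group(0).lower():
            issues.append("missing ALT attribute: " + img.group(0))
    return issues

# Before: no document language, no ALT text on a linked graphic
before = '<html><body><img src="permit.gif"></body></html>'
# After: one attribute added to each element resolves both findings
after = '<html lang="en"><body><img src="permit.gif" alt="Obtain a burn permit"></body></html>'

print(quick_audit(before))  # reports both issues
print(quick_audit(after))   # reports none
```

The remediation here changes nothing about the page's visual design, which is the point made above: these are attribute-level fixes, not redesigns.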
Neither the availability of these tools nor the presence of the distilled guidelines provided by ITS-530S2 seems to have mitigated Alabama's website accessibility issues. Nor, it is worth adding, does published research documenting the issue seem to have helped—Potter's (2002) study and West's yearly studies from 2001 to 2008 are all readily available. The question, then, is how to encourage the designers of Alabama state-level websites to address these issues. Lazar et al. (2004) found
that while many of the web professionals in their study were aware both of regulations regarding accessibility and of the availability of accessibility-related tools, many reported either that the sites they managed were not accessible or that they were unsure of their sites' accessibility status. The survey pointed to a number of potential "stumbling blocks" to improving accessibility, including time constraints, the need for additional training, and a lack of managerial support. Farrelly (2011) found similar issues in an interview-based study of web managers. Given the problems with state-level accessibility in Alabama, addressing these stumbling blocks may be a fruitful path in finding solutions to the problem. Another approach would be to increase monitoring and enforcement, though as Olalere and Lazar (2011) suggest, monitoring and enforcement, even at the federal level, have proved problematic. While there are clearly problems with many of Alabama's state-level websites, it is important to stress that Alabama is not alone in having these problems. Prior studies, such as West's longitudinal studies from 2001 to 2008 and Yu and Parmanto's (2011) national study, have found accessibility to be a problem with many state websites. As discussed above, Yu and Parmanto (2011) also note that website accessibility is a problem in the corporate world, and that state and federal government sites often outperform commercial sites when it comes to accessibility. These snapshot studies, much like Potter (2002) and this article, offer the opportunity to track the progress of e-government accessibility. Tracking this development provides several benefits, suggesting that the study might be replicated for other state government websites. First, the results of these studies have the potential to provide state governments with a better sense of how accessible their websites are. When problems are identified, governments will hopefully take the opportunity to remedy them.
Second, these studies can provide a foundation for researchers to take a more in-depth look into how the design decision-making process works and to identify ways to improve the implementation of accessibility, both in instances in which accessibility has been found to be problematic and in cases in which accessibility has improved.
Appendix A

Organization/site name (website address 2012) | Accessibility errors 2003 (Potter, 2002): A AA AAA 508 | Accessibility errors 2013: A AA AAA 508 AITS

Alabama Board of Architects (http://www.alarchbd.state.al.us) | 1 3 2 1 | 2 1 0 2 1
Board of Nursing (http://www.abn.state.al.us) | 1 2 2 1 | 0 0 0 0 1
Crime Victims Compensation Commission (http://www.acvcc.state.al.us) | 1 3 3 1 | 2 2 0 0 0
Department of Agriculture and Industry (http://www.agi.state.al.us) | 2 4 2 2 | 4 1 1 2 1
Department of Corrections (http://doc.state.al.us) | 1 2 3 1 | 2 0 1 1 1
Department of Forensic Sciences (http://www.adfs.alabama.gov) | 1 4 3 1 | 2 1 1 0 2
Department of Human Resources (http://www.dhr.state.al.us) | 0 2 1 0 | 3 1 0 2 1
Department of Industrial Relations (http://dir.state.al.us) | 0 2 4 1 | 0 0 0 0 1
Department of Public Health (http://www.adph.org) | 1 2 2 1 | 5 1 1 3 1
Department of Public Safety (http://dps.alabama.gov) | 1 2 3 2 | 3 2 1 2 2
Department of Revenue (http://www.ador.state.al.us) | 3 3 4 3 | 4 1 0 2 1
Department of Senior Services (http://www.alabamaageline.gov) | 1 3 3 1 | 3 1 1 1 1
Department of Veterans Affairs (http://www.va.state.al.us) | 1 2 2 1 | 4 1 0 1 2
Electrical Contractors Board (http://www.aecb.state.al.us) | 1 1 2 1 | 1 1 0 0 0
Emergency Management Agency (http://ema.alabama.gov) | 1 3 3 1 | 3 1 0 1 1
Home Builders Licensure Board (http://hblb.state.al.us) | 1 1 2 1 | 1 0 0 1 2
Alabama Homeland Security (http://homelandsecurity.alabama.gov) | 1 3 3 2 | 4 1 1 2 1
Judicial System Online (http://www.judicial.state.al.us) | 1 3 4 2 | 2 1 1 2 1
Law Institute (http://ali.state.al.us) | 1 1 1 1 | 5 1 1 3 2
Legislative Fiscal Office (http://www.lfo.state.al.us) | 1 3 2 1 | 3 1 1 1 2
(ACAS) Legislative Information System (http://alisondb.legislature.state.al.us/acas) | 1 1 1 1 | 4 1 1 2 2
Legislature (http://www.legislature.state.al.us) | 1 3 2 1 | 5 1 1 3 1
Medicaid (http://medicaid.alabama.gov) | 1 3 2 1 | 5 1 0 3 1
National Guard (http://alguard.state.al.us) | 2 2 2 2 | 6 0 1 3 1
Onsite Wastewater Board (http://aowb.state.al.us) | 1 3 3 1 | 3 1 1 1 1
Plumbers & Gas Fitters Examining Board (http://pgfb.state.al.us) | 1 2 2 1 | 2 0 1 1 1
Public Library Service (http://www.apls.state.al.us) | 0 1 3 0 | 2 1 0 1 1
Public Service Commission (http://www.psc.state.al.us) | X X X X | 5 0 0 3 1
Real Estate Commission (http://www.arec.state.al.us) | 1 3 3 2 | 2 2 1 2 1
Shakespeare Festival (http://www.asf.net) | 3 3 5 2 | 2 1 0 2 1
State Auditor's Office (http://www.auditor.state.al.us) | 1 3 3 1 | 3 1 1 1 1
State Board of Auctioneers (http://www.auctioneer.state.al.us) | 1 2 3 1 | 2 1 0 1 1
State Board of Public Accountancy (http://asbpa.state.al.us) | 1 2 2 1 | 1 1 0 0 1
State Board of Social Work Examiners (http://abswe.state.al.us) | 1 2 2 1 | 2 1 0 1 1
State Council on the Arts (http://www.arts.state.al.us) | 1 3 2 1 | 3 1 1 2 2
Alabama.gov (http://www.alabama.gov) | 0 0 1 0 | 0 0 0 0 1
Attorney General (http://www.ago.state.al.us) | 1 3 4 2 | 0 0 0 0 0
Board of Heating & Air Conditioning Contractors (http://www.hvacboard.state.al.us) | 1 1 2 1 | 1 0 0 0 2
Children's Trust Fund of Alabama (http://ctf.state.al.us) | 1 4 3 1 | 1 1 0 1 2
Department of Archives & History (http://www.archives.state.al.us) | 1 2 4 2 | 5 1 0 3 1
Department of Children's Affairs (http://dca.state.al.us) | 2 3 4 2 | 4 2 0 3 1
Department of Conservation and Natural Resources (http://www.dcnr.state.al.us) | 3 3 3 3 | 5 0 1 2 1
Department of Mental Health and Mental Retardation (http://www.mh.state.al.us) | 1 2 3 1 | 2 1 0 1 1
Department of Transportation (http://www.dot.state.al.us) | 0 2 2 0 | 4 0 1 3 1
(ADECA) Economic and Community Affairs (http://www.adeca.state.al.us) | 1 1 3 1 | 5 0 0 3 1
Environmental Management (http://www.adem.state.al.us) | 1 4 3 1 | 4 1 1 4 1
Forestry Commission (http://www.forestry.state.al.us) | 3 3 4 4 | 7 1 0 3 2
Geological Survey of Alabama (http://www.gsa.state.al.us) | 1 2 1 1 | 2 0 1 2 1
Governor's Office on Disability (http://www.good.state.al.us) | 0 3 2 1 | 3 1 0 1 1
Governor's Office On National and Community Service (http://www.servealabama.gov/2010/default.aspx) | 2 2 3 2 | 1 1 1 0 0
Indian Affairs Commission (http://aiac.state.al.us) | 1 3 3 1 | 3 1 0 2 1
Lieutenant Governor (http://www.ltgov.state.al.us) | 2 2 3 2 | 5 1 0 3 1
(ABEMFT) Marriage and Family Therapy (http://www.mft.state.al.us) | 1 2 2 1 | 3 1 0 1 2
Office of the Governor (http://www.governor.state.al.us) | 0 1 1 0 | 2 1 0 2 1
Office of the Secretary of State (http://www.sos.state.al.us) | 0 2 3 0 | 4 1 1 3 2
Peace Officers Annuity Fund (http://www.apoabf.state.al.us) | 0 2 2 0 | 1 1 0 0 2
Peace Officers Standards and Training Commission (http://apostc.state.al.us) | 0 2 2 0 | 3 1 0 2 1
State Licensing Board for General Contractors (http://www.genconbd.state.al.us) | 1 2 2 1 | 2 0 0 0 1
State Board of Physical Therapy (http://www.pt.state.al.us) | 1 2 2 1 | 4 1 0 1 1
State Personnel Department (http://www.personnel.state.al.us) | 0 2 2 0 | 0 0 0 0 1

Sites with no errors | 11 1 0 9 | 5 16 36 13 4

A comparison of 2003 and 2012 results.
References

Alabama Interactive (2002, February). Network Manager Report, February 2002. Retrieved September 26, 2013 from http://www.alabama.gov/PDFs/egov_pdfs/status_reports/february2002_status.pdf
Alabama Interactive (2005, May). Network Manager Report, May 2005. Retrieved September 26, 2013 from http://www.alabama.gov/PDFs/egov_pdfs/status_reports/may2005_status.pdf
Alabama Interactive (2009, December). Network Manager Report, December 2009. Retrieved September 26, 2013 from http://www.alabama.gov/PDFs/egov_pdfs/status_reports/may2005_status.pdf
Alabama Interactive (2013, August). Network Manager Report, August 2013. Retrieved September 26, 2013 from http://www.alabama.gov/PDFs/egov_pdfs/status_reports/August2013_status.pdf
Americans with Disabilities Act, Title II, pt. 35 § 35.190 (2010). Retrieved July 6, 2013 from http://www.ada.gov/regs2010/titleII_2010/titleII_2010_regulations.htm#a35161
ATRC (2009). IDI web accessibility checker: Web accessibility checker. Retrieved August 20, 2012, from http://achecker.ca/
ATRC (n.d.). AChecker: IDI web accessibility checker. Retrieved August 20, 2012, from http://atutor.ca/achecker/
Department of Justice (2012, September). Section 508 Report to the President and Congress: Accessibility of Federal Electronic and Information Technology. Retrieved June 10, 2013 from http://www.ada.gov/508/508_Report.htm
Donker-Kuijer, M. W., de Jong, M., & Lentz, L. (2010). Usable guidelines for usable websites? An analysis of five e-government heuristics. Government Information Quarterly, 27(3), 254–263. http://dx.doi.org/10.1016/j.giq.2010.02.006
Fagan, J. C., & Fagan, B. (2004). An accessibility study of state legislative websites. Government Information Quarterly, 21(1), 65–85. http://dx.doi.org/10.1016/j.giq.2003.12.010
Farrelly, G. (2011). Practitioner barriers to diffusion and implementation of web accessibility. Technology and Disability, 23(4), 223–232. http://dx.doi.org/10.3233/TAD-2011-0329
Fuglerud, K. S., & Røssvoll, T. H. (2012). An evaluation of web-based voting usability and accessibility. Universal Access in the Information Society, 11(4), 359–373. http://dx.doi.org/10.1007/s10209-011-0253-9
Gay, G., & Li, C. Q. (2010). AChecker: Open, interactive, customizable, web accessibility checking. Proceedings of the 2010 International Cross Disciplinary Conference on Web Accessibility (W4A '10), Article 23. New York, NY, USA: ACM. http://dx.doi.org/10.1145/1805986.1806019
Georgia Tech Research Institute (2009). Overview of state accessibility laws, policies, standards and other resources available on-line. Retrieved July 18, 2013 from http://accessibility.gtri.gatech.edu/sitid/stateLawAtGlance.php
Gilbertson, T. D., & Machin, C. H. C. (2012). Guidelines, icons and marketable skills: An accessibility evaluation of 100 web development company homepages. Proceedings of the International Cross-Disciplinary Conference on Web Accessibility (W4A '12), Article 17. New York, NY, USA: ACM. http://dx.doi.org/10.1145/2207016.2207024
Goette, T., Collier, C., & White, J. D. (2006). An exploratory study of the accessibility of state governments. Universal Access in the Information Society, 5(1), 41–50. http://dx.doi.org/10.1007/s10209-006-0023-2
Hackett, S., Parmanto, B., & Zeng, X. (2004, October). Accessibility of Internet websites through time. ACM SIGACCESS Accessibility and Computing (No. 77–78, pp. 32–39). ACM.
Hanson, V. L., & Richards, J. T. (2013). Progress on website accessibility? ACM Transactions on the Web (TWEB), 7(1), 2.
Harper, S., & Chen, A. (2012). Web accessibility guidelines. World Wide Web, 15(1), 61–88. http://dx.doi.org/10.1007/s11280-011-0130-8
Jaeger, P. T. (2004). Beyond Section 508: The spectrum of legal requirements for accessible e-government web sites in the United States. Journal of Government Information, 30(4), 518–533. http://dx.doi.org/10.1016/j.jgi.2004.09.010
Jaeger, P. T. (2006). Assessing Section 508 compliance on federal e-government web sites: A multi-method, user-centered evaluation of accessibility for persons with disabilities. Government Information Quarterly, 23(2), 169–190. http://dx.doi.org/10.1016/j.giq.2006.03.002
Lazar, J., Dudley-Sponaugle, A., & Greenidge, K. (2004). Improving web accessibility: A study of webmaster perceptions. Computers in Human Behavior, 20(2), 269–288. http://dx.doi.org/10.1016/j.chb.2003.10.018
Lazar, J., Beavan, P., Brown, J., Coffey, D., Nolf, B., Poole, R., et al. (2010). Investigating the accessibility of state government web sites in Maryland. In P. M. Langdon, P. J. Clarkson, & P. Robinson (Eds.), Designing inclusive interactions, pt. 2 (pp. 69–78). London: Springer-Verlag Ltd. http://dx.doi.org/10.1007/978-1-84996-166-0_7
Lazar, J., Wentz, B., Almalhem, A., Catinella, A., Antonescu, C., Aynbinder, Y., et al. (2013). A longitudinal study of state government homepage accessibility in Maryland and the role of web page templates for improving accessibility. Government Information Quarterly, 30(3), 289–299.
Li, S.-H., Yen, D. C., Lu, W.-H., & Lin, T.-L. (2012). Migrating from WCAG 1.0 to WCAG 2.0 — A comparative study based on web content accessibility guidelines in Taiwan. Computers in Human Behavior, 28(1), 87–96. http://dx.doi.org/10.1016/j.chb.2011.08.014
Loiacono, E., McCoy, S., & Chin, W. (2005). Federal website accessibility for people with disabilities. Information Technology Professional, 7(1), 27–31.
Olalere, A., & Lazar, J. (2011). Accessibility of U.S. federal government home pages: Section 508 compliance and site accessibility statements. Government Information Quarterly, 28(3), 303–309. http://dx.doi.org/10.1016/j.giq.2011.02.002
Potter, A. (2002). Accessibility of Alabama government web sites. Journal of Government Information, 29(5), 303–317.
Rubaii-Barrett, N., & Wise, L. R. (2008). Disability access and e-government: An empirical analysis of state practices. Journal of Disability Policy Studies, 19(1), 52–64.
State of Alabama (n.d.). State of Alabama – Cyber security – Search policies. Retrieved August 9, 2012, from http://cybersecurity.alabama.gov/PoliciesStandards.aspx
State of Alabama (2002a). Electronic government framework and strategy. Retrieved September 27, 2013 from http://www.alabama.gov/PDFs/egov_pdfs/e-govFrameworkandStrategy.pdf
State of Alabama (2002b). Alabama.gov. Retrieved September 26, 2013 from http://web.archive.org/web/20021001084124/http://www.alabama.gov/default.aspx
State of Alabama (2011a). alabama.gov: The Official Website of the State of Alabama. http://web.archive.org/web/20110716193758/http://www.alabama.gov/portal/index.jsp
State of Alabama (2011b). Information Technology Standard 530S2-00: Universal accessibility. Retrieved October 11, 2011, from http://cybersecurity.alabama.gov/documents/Standard_530S2_Universal_Accessibility.pdf
State of Rhode Island (2013). RI.gov: Accessibility policy. Retrieved July 18, 2013 from http://www.ri.gov/policies/access.php
State of Vermont (2013). Vermont.gov – Policies – Accessibility policy. Retrieved July 18, 2013 from http://www.vermont.gov/portal/policies/accessibility.php
U.S. Access Board (2011). 2011 ANPRM draft. Retrieved August 8, 2012 from http://access-board.gov/sec508/refresh/draft-rule.htm
U.S. Access Board (n.d.). About the United States Access Board. Retrieved August 8, 2012 from http://access-board.gov/about.htm
U.S. Census Bureau (n.d.). American Community Survey, 2008. Retrieved June 10, 2012 from http://factfinder2.census.gov
United Nations (2006). Convention on the rights of persons with disabilities and optional protocol. Retrieved June 14, 2012, from http://www.un.org/disabilities/documents/convention/convoptprot-e.pdf
United Nations (2007). UN Enable—Convention on the rights of persons with disabilities. Retrieved June 14, 2012, from http://www.un.org/disabilities/default.asp?navid=12&pid=150
Vanderheiden, G. C. (1995). Design of HTML (Mosaic) pages to increase their accessibility to users with disabilities: Strategies for today and tomorrow. Retrieved June 18, 2012, from http://trace.wisc.edu/archive/html_guidelines/version1.html
Vanderheiden, G. C. (2009). Quantification of accessibility: Guidance for more objective access. In C. Stephanidis (Ed.), Universal access in human–computer interaction. Addressing diversity (pp. 636–642). Berlin: Springer-Verlag. http://dx.doi.org/10.1007/978-3-642-02707-9_72
Vanderheiden, G. C., & Chisholm, W. A. (1998). Central reference document — Version 8. Unified web site accessibility guidelines. Retrieved June 18, 2012, from http://www.w3.org/WAI/GL/central.htm
W3C (1997). World Wide Web Consortium launches International Program Office for Web Accessibility Initiative. Retrieved May 15, 2013, from http://www.w3.org/Press/IPO-announce.html
W3C (1999). Web Content Accessibility Guidelines 1.0. Retrieved July 5, 2012, from http://www.w3.org/TR/WCAG10/
W3C (2008). Web Content Accessibility Guidelines (WCAG) 2.0. Retrieved June 13, 2012, from http://www.w3.org/TR/WCAG
W3C (2009). Comparison of WCAG 1.0 checkpoints to WCAG 2.0, in numerical order. Retrieved July 5, 2012, from http://www.w3.org/WAI/WCAG20/from10/comparison/
W3C (2010). Accessibility — W3C. Retrieved June 12, 2012, from http://www.w3.org/standards/webdesign/accessibility
West, D. M. (2001). State and federal e-government in the United States, 2001. Retrieved August 20, 2013, from http://www.insidepolitics.org/egovt01us.PDF
West, D. M. (2002). State and federal e-government in the United States, 2002. Retrieved June 4, 2012, from http://www.insidepolitics.org/egovt02us.pdf
West, D. M. (2003a). Global e-government, 2003. Retrieved August 20, 2012, from http://www.insidepolitics.org/egovt03int.pdf
West, D. M. (2003b). State and federal e-government in the United States, 2003. Retrieved August 20, 2013, from http://www.insidepolitics.org/egovt03us.pdf
West, D. M. (2004). State and federal e-government in the United States, 2004. Retrieved August 20, 2013, from http://www.insidepolitics.org/egovt04us.pdf
West, D. M. (2005). State and federal e-government in the United States, 2005. Retrieved August 20, 2013, from http://www.insidepolitics.org/egovt05us.pdf
West, D. M. (2006). State and federal e-government in the United States, 2006. Taubman Center for Public Policy, Brown University. Retrieved October 11, 2011, from http://www.insidepolitics.org/egovt06us.pdf
West, D. M. (2007). State and federal electronic government in the United States. Retrieved September 3, 2011, from http://www.insidepolitics.org/egovt07us.pdf
West, D. M. (2008). State and federal electronic government in the United States, 2008. The Brookings Institution. Retrieved October 11, 2011, from http://www.brookings.edu/~/media/Files/rc/reports/2008/0826_egovernment_west/0826_egovernment_west.pdf
Youngblood, N. E., & Mackiewicz, J. (2012). A usability analysis of municipal government website home pages in Alabama. Government Information Quarterly, 29(4), 582–588, http://dx.doi.org/10.1016/j.giq.2011.12.010.
Yu, D. X., & Parmanto, B. (2011). U.S. state government websites demonstrate better in terms of accessibility compared to federal government and commercial websites. Government Information Quarterly, 28(2), 484–490, http://dx.doi.org/10.1016/j.giq.2011.04.001.
Norman E. Youngblood teaches interactive media in the School of Communication & Journalism at Auburn University and is a co-director of the Laboratory for Usability, Communication, Interaction and Accessibility. His recent publications include "A usability analysis of municipal government website home pages in Alabama," published in Government Information Quarterly, and Multimedia Foundations, a coauthored textbook on digital media design and production.