The Washington Post recently ran an article about making information on federal web pages more findable. I wish I were more technically savvy so I could really understand the difficulties, but I think it has to do with how deeply buried the pages are. The article cites the example of a user having to fill out a form before certain information can be accessed; pages like that aren't visible to web crawlers. There's an effort underway to improve the situation because, as we know, if people can't find something on Google they assume it doesn't exist. In some cases users can get at the information by going directly to the federal agency's website, but many people never look beyond Google.
This reminds me of a session I attended at a Webwise conference in 2007 on a slightly different topic, but still about accessibility. The presenters talked about the Web-at-Risk project, which tries to archive federal web pages before they disappear. Their example was the "numerous Web sites and blogs that emerged in the aftermath of Hurricane Katrina (to help direct aid and provide real-time information for the affected area)," which were fast disappearing. It was a very interesting session, and if you're interested in reading about it, it's on page 12 of the Webwise proceedings linked above.