Data-based Decision Making

Value the use of data as the starting point for professional work.

Context

Class: EDTEC 590, Evaluation Techniques for the Performance Technologist
Instructor: Marcie Bober-Michel, Ph.D.
Project: Evaluation of the SDSU Department of Educational Leadership (EDL) website (shown below)
Artifact: Evaluation Report

Cover of the Evaluation Report

Janet Saman and I were assigned to evaluate the SDSU Department of Educational Leadership (EDL) website. Our task was to assess the current site by confirming user perceptions of its content, structure, and navigation and to provide recommendations for improvement.

Connection to the Standard

The Proactive (or “up-front”) evaluation approach we used for this project exemplified the data-based decision making standard. This approach is particularly appropriate when an existing product needs a major review, because it yields findings that aid decisions about radical changes to that product (Owen, 2007). As part of this approach, we used three data-gathering methods:
  • Literature Review: We identified data collection methodologies used in other website redesigns and generally accepted standards for website design. This positioned us to build instruments aligned with the needs of the evaluation.
  • Survey: We created a 13-question survey in SurveyMonkey and distributed it to students, faculty, and staff to determine:
    • How they used the website,
    • If it provided information they needed,
    • If they were able to locate the information they needed, and
    • What impression they had of the EDL program after visiting the website.
  • Benchmarking Study: Because our client placed great importance on ensuring that visitors to the EDL website had a meaningful experience, we developed a benchmarking checklist focused on evaluating how well a website’s design engages its target audience. Our evaluation team, plus two independent evaluators, used the checklist to review the SDSU, UCLA, and SJSU educational leadership websites; a simple sketch of how such checklist ratings can be tallied follows this list.
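
The Python sketch below illustrates how checklist ratings like these can be tallied across reviewers and sites. It is only an illustration: the criteria, the 1-to-5 rating scale, the placeholder site labels, and every score shown are hypothetical, not the checklist items or results from our study.

    # Illustrative tally of a benchmarking checklist scored by several reviewers.
    # The criteria, the 1-5 scale, and all scores below are hypothetical
    # placeholders; they are not the checklist or ratings from the actual study.
    from statistics import mean

    criteria = ["clear navigation", "audience engagement", "visual consistency"]

    # ratings[site][criterion] -> one score per reviewer (four reviewers here)
    ratings = {
        "Site A": {"clear navigation": [2, 3, 2, 3],
                   "audience engagement": [3, 2, 3, 3],
                   "visual consistency": [2, 2, 3, 2]},
        "Site B": {"clear navigation": [4, 5, 4, 4],
                   "audience engagement": [4, 4, 5, 4],
                   "visual consistency": [5, 4, 4, 5]},
    }

    for site, scores in ratings.items():
        per_criterion = {c: mean(scores[c]) for c in criteria}
        overall = mean(per_criterion.values())
        summary = ", ".join(f"{c}: {v:.1f}" for c, v in per_criterion.items())
        print(f"{site}: overall {overall:.1f} ({summary})")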

Through our analysis of the survey and benchmarking results, we identified significant discrepancies between the current condition of the EDL website and users’ needs. We documented our findings and recommendations in an evaluation report provided to our client. The report recommended the following priorities for the website redesign:

  • Promote the EDL program's brand on the home page,
  • Correct all broken links and remove links to incomplete pages,
  • Add information identified by survey respondents as being the most needed, and
  • Improve the site’s professional appearance by applying the web design principles of proximity, repetition, alignment, and contrast (Williams & Tollett, 2006).

Challenges & Lessons Learned

One problem we encountered was that summer vacations limited the availability of our survey population. The initial survey email yielded only 22 responses from the 209 people invited. To overcome this, we issued a reminder, which raised the response rate to 25%, a level we considered sufficient for our purposes. We also found it challenging to convert our benchmarking results into meaningful findings because we had not required the testers to explain their ratings. Although we were able to use Excel’s conditional formatting feature to visually highlight how well each website met the benchmarking criteria, the lesson for future studies was clear: require testers to explain their ratings and describe any best practices they observe.
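
For reference, the response-rate figures above work out as shown in this quick Python sketch, which simply restates the numbers reported in this paragraph:

    # Response-rate arithmetic using the figures reported above.
    invited = 209          # people who received the survey invitation
    first_wave = 22        # responses to the initial email
    final_rate = 0.25      # response rate reached after the reminder

    print(f"Initial response rate: {first_wave / invited:.1%}")   # about 10.5%
    print(f"Responses implied by the 25% rate: about {round(invited * final_rate)}")   # about 52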

What It Showcased About Me

In addition to demonstrating that I can apply an evaluation methodology to help a client make decisions about an existing program, this project showcased my ability to draw on a variety of resources to build data-gathering instruments.

Future Application

This project gave me a strong appreciation for evaluation and helped me realize that the skills I gained from this degree program can be applied to more than instructional design. In particular, I gained significant knowledge of website design principles and best practices, which I can use professionally either to design websites or to evaluate them.

References

Owen, J. M. (2007). Program evaluation: Forms and approaches (3rd ed.). New York: The Guilford Press.

Williams, R., & Tollett, J. (2006). The non-designer’s web book (3rd ed.). Berkeley, CA: Peachpit Press.