Thursday, May 7, 2009

Welcome


Our e-portfolio



Welcome to our e-portfolio. From this page you can navigate our e-portfolio corridors just by clicking once on the provided links. These corridors contain the projects and assignments that we developed for the TECH4102 course (Evaluation in Educational Technology).
Best wishes and enjoy your trip with us.

By:
Asma (63376)
Fatma (63249)
Course instructor: Dr. Alaa Sadik

ILT Dept, SQU



Our Corridors:

1. Evaluation in Educational Technology (study1, study2)
Here are summaries of two evaluation studies in educational technology. The first summary focuses on the methodology used to evaluate the quality of the output of four state-of-the-art French text-to-speech synthesis systems, in terms of purpose and instruments used. The second focuses on the evaluation of network features.

2. Levels and techniques of evaluation in educational technology
Defines the levels and techniques of evaluation used in investigating the use of advance organizers as an instructional strategy for web-based distance education.

3. Models of evaluation in educational technology.
Using Badrul Khan's model to evaluate a study skills course.

5. Evaluation of CAI
A checklist we developed that can be used to evaluate the navigation, structure, instructional aspects, practice/assessment/feedback, and ease of use of instructional software.

6. Evaluation of online learning
A short paper describing the procedures and strategy that we applied to evaluate an online services provider, the Vbulletin company, in terms of resources and support.

7. Comparative and non-comparative evaluation in educational technology.
A short essay summarizing the strategies applied to evaluate educational technologies using comparative (performance studies) and non-comparative approaches.

Comparative and non-comparative evaluation in educational technology

In this assignment we summarized two types of studies in the IT field: a comparative (performance) study and a non-comparative study. The former was entitled "Information retention from PowerPoint™ and traditional lectures". It was conducted to investigate the effects of PowerPoint on student performance (e.g., overall quiz/exam scores) in comparison to lectures based on overhead projectors, traditional lectures (e.g., "chalk-and-talk"), and online lectures. To carry out the study the researchers used two delivery styles for the lectures, traditional and PowerPoint. A third presentation category, no class, was formed with the students who were not present during either of the delivery styles. This approach was used to measure the effects of delivery style on pre-defined variables such as quiz-measured performance, recognition of graphic information, recognition of auditory information, recognition of audio/visual information, and overall recognition of information, the last measured by the percentage of correct answers pertaining to the information provided during the lectures.

The latter study, the non-comparative one, was entitled "Does the amount of on-screen text influence student learning from a multimedia-based instructional unit?". It was designed to examine how changes in the amount of on-screen text influence student learning from a multimedia instructional unit on basic concepts of coordinate geometry. In order to measure the influence of the amount of on-screen text on student learning, performance gains and retention levels of the participants were assessed through performance measures.
These performance measures were based on the school’s curricular objectives and the instructional objectives of the multimedia unit.
The timing and the format of the tests were based on the school's regular testing procedures (for example, short-answer items were preferred). Two parallel forms were used to assess performance gains (pre- and post-tests) and a shortened form of these tests was used to assess the level of retention.

Click here for a PowerPoint version.


References:

1. Non-comparative Study:
Arda, D., & Unal, S. (2008). Does the amount of on-screen text influence student learning from a multimedia-based instructional unit? Springer.
2. Comparative Study:
Savoy, A., Proctor, R. W., & Salvendy, G. (2008). Information retention from PowerPoint™ and traditional lectures. Computers & Education, 52.

Evaluation of Online Learning

Vbulletin is a well-known company that provides forum platform software. It is located in the UK, but has a website for purchasing its products and getting support. Most people around the world buy from and contact the company through the internet. This paper mainly focuses on evaluating the support services of this company, which provides pre-sales and sales support. The support is delivered in a variety of ways: telephone, fax, mail, an online ticket system, and e-mail. Here are two screenshots of the support environment in Vbulletin:



Purpose:
Evaluating the services provided to users of the company's products is very significant for assuring high quality, especially in the business sector. Support services can be a reason for winning or losing customers. Thus, this evaluation explores the level and kind of support provided by the Vbulletin company to its licensed users.

Questions:
· What are the support services provided by the Vbulletin company to its users?
· Are the support services effective in serving their purpose?


Evaluation Instruments:
In order to evaluate the support services provided by the Vbulletin company to its licensed users, an evaluation checklist and interview questions were developed. We started developing the instruments based on Badrul Khan's model. We also brainstormed and browsed the websites of different companies that provide support to get ideas of what such support should cover and how it should be delivered. We developed the checklist first, but it was not adequate for evaluating the support features, which led us to develop interview questions and contact one of the software's licensed users to conduct an interview.
Checklist link: http://www.surveyconsole.com/console/TakeSurvey?id=569588
Interview link: http://www.surveyconsole.com/console/TakeSurvey?id=570390


Participants:
We, the writers of this paper, played the role of external evaluators by using the checklist to determine what support aspects the company provides to its users. In addition, we contacted one of the company's software users to get her responses to the interview questions we developed.


Results:
Checklist results (note: the checklist was used only once, by the external evaluators, since it evaluates facts rather than opinions):
http://www.surveyconsole.com/console/listSummaryReport.do?enableStatisticalDetails=true
Interview:
http://www.surveyconsole.com/console/listTextReport.do


Analysis of the results:

The checklist:
The checklist results indicate that the company provides a variety of support services for its users through various means: an online ticket system, e-mail, discussion forums, telephone, fax, and mail. Users are provided with a user manual when they buy the software, with instructions on how to install, run, maintain, and use its various functionalities. The support services are available 24 hours a day, 7 days a week. In addition, Frequently Asked Questions are available for users. However, the company does not provide instructions for troubleshooting in advance.

The interview:
The user seems to be satisfied with the support services provided by the company in terms of getting fast online help, recent software updates, and an electronic manual. Some weaknesses in the company's support were noted by the interviewee. The level of English used is higher than the English proficiency of some software users. In addition, the technical level of the instructions was also perceived as high by the user.

Advantages of the Technology:
Providing various means of support to ensure accessibility.
Providing a user manual that enables users to use the software's features.
Round-the-clock availability of support.
Fast responses to users' questions.
Providing an FAQ available online.

Limitations of the Technology:
No instructions for troubleshooting in advance.
The high level of English used.
The high technical level of the instructions.

Discussion:
The company provides good support services for its users through different means. However, there are some areas that need to be improved to ensure the high quality of the support and consequently of the company as a whole. The level of English used is higher than the English proficiency of some software users. To overcome this, we suggest that the company simplify the English used across all support channels or provide a manual in different languages to make it accessible to a larger portion of users.
The technical level of the instructions was also perceived as high by the user. The company should carefully consider the user's technical level when providing instructions.

Appendices:

The Checklist:
Do users get a user manual when they buy the software?
Does the company provide demos of the software?
Do users receive guidance on any of the following skills:
- Installing the vbulletin software
- Upgrading the software
- Posting announcements to users
- Creating forums and subforums
- E-mailing users
- Moderating (editing, deleting, moving, and sticking) posts in the forum
- Adding attachments
- Forum maintenance
Does the company provide someone other than the seller who can assist users with technical problems?

Does the company provide an FAQ service?
Does the company provide instructions for troubleshooting in advance?


Does the company provide round-the-clock (24/7) technical support?

Does the company provide a searchable glossary for help content?

Does the company provide the following contact means:
Mail
Telephone
Discussion forums
Fax
E-mail

Does the company notify users of the latest updates to the software?


Interview questions and the responses of one of the service's users:

Do you find the level of English used in the technical support difficult to understand?
Sometimes. I have contacted the company several times, and I faced some difficulties in understanding some words in the responses I got.

Does the company provide you with recent developments and software updates?

Yes, I always get e-mails from the company with the recent developments and updates of the software. I also find announcements in the company's forum about the developments.

What are the difficulties you face when using the software and how does the company help you in that?
Sometimes I want to do things I see people doing in their forums, but I don't know how to do them or what the proper terms are to search for these features. In such cases the company provided me with an e-mail address to contact for support. It also provides forums where the software users meet and exchange experiences.

Does the company provide you with an electronic user manual?
Yes, and it covers a lot of the features for managing the forum software.

If yes, is the manual comprehensive (containing all the needed information to install, use and maintain the software)?
I think so, but I have a problem finding the information I need in the manual.

Have you contacted the company asking for help/support?
Yes, I did several times

If yes, How long did you wait to get their response?

I got the replies within a maximum of 2 hours.

Did you find the instructions you got easy to follow?
Not always. Once they responded to my question with complicated instructions and I didn't know what to do.

Was your question or issue resolved the first time you contacted technical support?
Most of the time, yes.


Slideshow: http://www.slideshare.net/Asma44/evaluation-of-online-environment-1412793


References:
Khan, B. (2005). Managing E-learning Strategies. California, United States: Idea Group Inc.
Vbulletin Company:
http://www.vbulletin.com/
Survey Console Website:
http://www.surveyconsole.com/


Levels and techniques of evaluation in educational technology

Chen and others conducted a study in 2007 investigating the use of advance organizers as an instructional strategy for web-based distance education. It was conducted at the curriculum level, in an undergraduate course on health care ethics. Three instruments (techniques) were used: post-test 1 and post-test 2 (to measure short-term and long-term effects) and a student survey.


Reference:
Chen, B., Hirumi, A., & Zhang, N. J. (2007). Investigating the use of advance organizers as an instructional strategy for web-based distance education. The Quarterly Review of Distance Education, 8(3). Online:
http://web.ebscohost.com/ehost/pdf?vid=2&hid=6&sid=8e908a64-c9cf-4adc-8e28-9da845565fe9%40sessionmgr7 (accessed on 26/9/2008)

Evaluation strategies

To evaluate educational software, we used a checklist from the internet to gather audience opinions. This checklist was built from published rubrics developed by Kristin Miller and Jacqueline Bach. It consists of several main sections: instructional content, curriculum connections, graphics/multimedia, layout, technical aspects, accessibility, interactivity/engagement, teacher and learner support material, assessment, and flexibility.



Reference:
http://SASinSchool.com

Saturday, April 25, 2009

Comparative Study

Information retention from PowerPoint™ and traditional lectures
By: April Savoy, Robert W. Proctor, and Gavriel Salvendy

Introduction:
The study investigates the effects of PowerPoint on student performance (e.g., overall quiz/exam scores) in comparison to lectures based on overhead projectors, traditional lectures (e.g., "chalk-and-talk"), and online lectures. The present study decomposes overall quiz scores into auditory, graphic, and alphanumeric scores to reveal new insights into the effects of PowerPoint presentations on student performance.
The objective of this study is to determine how to present information effectively for maximum retention. Data were collected from 62 students via quiz and questionnaire.
Research approach:
There were two lectures presented during the experiment. Each lecture was presented using traditional and PowerPoint delivery styles.
Lecture 1 was titled "Attention". The material covered different attention models (i.e., bottleneck models) and concepts of filter theory, attenuation theory, and issues of perceptual load.
Lecture 2 was titled "Memory Stores and Working Memory".

The researchers used two delivery styles for the lectures, traditional and PowerPoint. A third presentation category, no class, was formed with the students who were not present during either of the delivery styles. They measured the effects of these categories on the following variables (a minimal sketch of the scoring appears after the list):

1. Quiz-measured performance
2. Recognition of graphic information
3. Recognition of alphanumeric information – measured by the percentage of correct answers pertaining to the alphanumeric information
4. Recognition of auditory information
5. Recognition of audio/visual information – measured by the percentage of correct answers pertaining to information that was presented auditorily and visually
6. Overall recognition of information – measured by the percentage of correct answers pertaining to the information provided during the lectures
7. Preference – the questionnaire measured preference, where delivery style preference and perceived importance were the dependent variables
Findings:
Students retained 15% less information delivered verbally by the lecturer during PowerPoint presentations, but they preferred PowerPoint presentations over traditional presentations.

Non-Comparative Study

Does the amount of on-screen text influence student learning from a multimedia-based instructional unit?

Summary:

This study examines how changes in the amount of on-screen text influence student learning from a multimedia instructional unit on basic concepts of coordinate geometry. The relative effectiveness of two different versions (short-text and whole-text) of the instructional unit was examined for students who differed in terms of their ability to remember symbolic units, symbolic systems, and symbolic interpretations. A total of 101 seventh graders were randomly assigned to work with either the whole-text or the short-text version. Student gains were analyzed using pre-test, post-test, and retention test scores. Memory ability was assessed by sub-tests of the Structure of Intellect-Learning Abilities Test. Results indicated no significant differences between the groups who worked with the short-text and whole-text versions. However, retention scores of high- and low-memory groups who worked with the whole-text version showed significant differences.


The whole-text version was observed to favor students with high memory for symbolic implications. Results suggest that workability of design principles for multimedia instruction may depend on the nature of the task and characteristics of the learner.



Evaluation Approach:

In order to measure the influence of the amount of on-screen text on student learning, performance gains and retention levels of the participants were assessed through performance measures. These performance measures were based on the school's curricular objectives and the instructional objectives of the multimedia unit. The timing and the format of the tests were based on the school's regular testing procedures (for example, short-answer items were preferred). Two parallel forms were used to assess performance gains (pre- and post-tests) and a shortened form of these tests was used to assess the level of retention.
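The study reports performance gains and retention levels from these tests, but it does not spell out the computation. The sketch below is a minimal illustration under our own assumptions: gain is taken as post-test minus pre-test, and the score on the shortened delayed test is treated as the retention level; the variable names and scores are hypothetical.

```python
# Minimal sketch under our own assumptions (not taken from the paper):
# performance gain = post-test minus pre-test on the parallel forms, and the
# score on the shortened delayed test is used as the retention level.
# The participant records below are purely illustrative.

from statistics import mean

# Hypothetical records: one dict per participant
participants = [
    {"group": "whole-text", "pre": 10, "post": 17, "retention": 15},
    {"group": "whole-text", "pre": 12, "post": 18, "retention": 16},
    {"group": "short-text", "pre": 11, "post": 16, "retention": 13},
    {"group": "short-text", "pre": 9,  "post": 15, "retention": 12},
]

def group_summary(records, group):
    rows = [r for r in records if r["group"] == group]
    gains = [r["post"] - r["pre"] for r in rows]   # performance gain per student
    retention = [r["retention"] for r in rows]     # retention level per student
    return {"mean_gain": mean(gains), "mean_retention": mean(retention)}

for g in ("whole-text", "short-text"):
    print(g, group_summary(participants, g))
```

Group means computed this way are the kind of values the study then compares between the short-text and whole-text versions.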

Reference:
http://www.springerlink.com/content/2jn56259qq781h53/fulltext.pdf