
Classroom Evaluation


Classroom Evaluation Criteria:

1. What is to be measured in the classroom to evaluate the effectiveness of the pilot site?

2. Was there a pre survey?

3. Is there a post survey?

4. What kind of evidence will be collected that shows student/learner outcomes have been achieved through the use of LINCS and technology?

5. How many sessions were conducted?

6. How many students/learners/teachers did you plan on participating?

7. How many actually participated?  Explain.

8. Was there a formal feedback process incorporated in the project?

9. What were challenges in the pilot site classroom?

10. What were the shortcomings in the pilot site classroom?

11. How did you address challenges?

12. Is there strong evidence to suggest that there has been a change in student behaviors pertaining to use of technology in the classroom or their studies?

Evaluation 1: Oregon Pilot Site 2003, Lesson 7

Evaluation 2: Oregon Pilot Site 2003, Lesson 5

Evaluation 3: Oregon Pilot Site 2003, Lesson 4

Evaluation 4: Oregon Pilot Site 2003, Lesson 10

Oregon Pilot Site 2003 Lesson Plans


1. What is to be measured in the classroom to evaluate the effectiveness of the pilot site?

 

This year we took a first step by asking instructors about the value of these lessons to teachers and students in family literacy programs. The evaluation form consisted of seven questions designed to give us insight into how these lessons translated into real classroom situations:

1. classroom context: ESL, parenting, interactive activities

2. lesson format

3. clarity of directions in plan

4. adaptation or expansion by user

5. student response

6. what instructor learned from students through use of lesson

7. other comments

From these responses we hoped to gather as much information as each instructor could provide on what, specifically, would be valuable to measure. Some results:

1. increased technology awareness and skills (could measure both students and instructors)

2. time spent on task (some lessons are long-term projects and won’t work well without the necessary time allotted)

3. new knowledge of students that instructors gained by doing the activity

4. student response to the activity (content, structure, personal gain, family gain, how the experience will enrich home activities and future parent/child interaction)

5. attendance/retention, measured by the number of students in the classroom who completed the activity

6. limits of on-site technology and creative adaptations to suit the real situation


2. Was there a pre survey?

No.


3. Is there a post survey?

The post survey is the evaluation sheet. The next phase of measurement should be to create pre- and post-evaluations that measure the items listed above, along with any others considered worth measuring.


4. What kind of evidence will be collected that shows student/learner outcomes have been achieved through the use of LINCS and technology?

Some ideas:

  1. student feedback on the activity (content, structure, personal gain, family gain, how the experience will enrich home activities and future parent/child interaction)

  2. attendance/retention, measured by the number of students in the classroom who completed the activity

  3. instructor assessment of the individual learner

  4. long-term evidence: students reporting that they have incorporated the knowledge gained into real-life practice after the lesson

5. How many sessions were conducted?

Unknown


6. How many students/learners/teachers did you plan on participating?

As many as could respond in our two individual programs.


7. How many actually participated?  Explain.

Four instructors, two from West Washington County Family Literacy Collaborative and two from LISTO.


8. Was there a formal feedback process incorporated in the project?

Yes. Written evaluations and comments by instructors.


9. What were challenges in the pilot site classroom?

Low-level computer skills among a number of students.

With the camera lessons, some students were slow to complete the picture-taking at home.


10. What were the shortcomings in the pilot site classroom?

Limited technology resources, including too few computers.


11. How did you address challenges?

We used computers in only part of each lesson and adapted activities for use without computers.


12. Is there strong evidence to suggest that there has been a change in student behaviors pertaining to use of technology in the classroom or their studies?

Instructors report:

One final note: technology was a tool, not the focus of the activities. When the focus of an activity is meaningful to students, the use of technology becomes more natural and integrated into the total learning experience.

