Tag Archive: ECUR809


Assignment 4 – Logic Model

Logic Model: Diverse Learner Teacher (DLT) Program

 


Assumptions: The first assumption is that all stakeholders are committed to the DLT program and its success.  Success is measured by meeting the needs of the diverse learners in our school population.  What does this look like?  It means that students are meeting their full potential and that, through adaptations and modifications, students can meet curriculum outcomes.  If for any reason students are unable to meet the curriculum outcomes, the DLT program will still allow them to reach their full potential.  The second assumption is that all stakeholders are committed to the principles of Universal Design for Learning (UDL) and understand what those principles look like and mean for our students.

External factors: Factors affecting the success of the DLT program include time, place, and the level of involvement of classroom teachers.  The involvement of other students, language, length of time in Canada, and attitude will also contribute to success.

Inputs: There are several inputs to the DLT program.  Most important are the DLTs themselves.  They play a key role in the success of the program and, as you can see from the logic model above, influence every other aspect of the model.  They are guided by the principles of Universal Design for Learning.  Supporting them in their school-based roles are administration, DLT consultants at the district level, and fellow DLTs at other locations in the district.  Other inputs to the DLT program include classroom teachers, students, outside agencies, and guiding documents.

Outputs → Activities: The activities can be summarized in three key areas.

1) Working with students – direct interactions between DLTs and students.  This can include one-to-one support, small group support, or large group support.

2) Working FOR students – this is an administrative role covering assessment, observation, and intervention.  This can include recommendations for special placements and modified/adapted programming.

3) Working with teachers – this role supports teachers in their day-to-day practice and helps them design best practices related to UDL.

Outputs → Participation: From the activities, participation includes:

-DLT team meetings.

-Individualized Program Plan (IPP) Workshops.

-Completed forms for Alberta Education and for District records.

Outcomes/Impacts

The primary goal of the DLT program is to improve student academic achievement, with a secondary goal of improved overall student well-being.  All short- and medium-term outcomes lead to these end goals.

Medium-term outcomes involve outside agencies (such as Intercultural Wellness Workers, Occupational Therapists, and Physiotherapists), formalized learning plans (IPPs/SAPs), and improvements in teacher best practice (technology integration and professional development in UDL).

Short-term impacts are closely related to the participation of stakeholders.  These include classroom visits, filling out the appropriate paperwork, developing student action plans, and working with teachers on exemplars and examples of best practice in the classroom.

Program Evaluation Review

Here we are in 2012 and I have a new batch of courses that I’m taking.  I just finished up Instructional Design and Designing for Distance Education.  My next two courses are Program Evaluation and Advanced Instructional Design.  Completion of these two courses will put me over the halfway mark in my Master’s program.  That’s hard to believe!  This first blog entry (in a LONG time) is a program evaluation review.  Over the coming months, I’ll be posting assignments here on my blog.  I encourage comments!

J.T.

Review of “The Brandon Middle Years Pilot Project Program Evaluation” (BMYPPPE)

The BMYPPPE was completed by David Patton, Ph.D., and Jackie Lemaire, B.Sc., in August of 2002.  It is an evaluation of a pilot project taking place in two Brandon junior high schools, whose goal was to increase grade 7 and 8 students’ knowledge of alcohol and drug use and to see if increased knowledge would change students’ decision-making skills.  I found the report after a Google search of “middle years” and “program evaluation”.  Since middle years are my area of interest, I wanted to find a report that dealt with that target group.

What I liked about the report:

The report was well organized and properly formatted.  It was official and professional.  It followed the general program evaluation format: title page, table of contents, introduction, research results, summary, and appendices.  There were no grammatical errors, and the findings were reported at an accessible reading level.

Findings were presented in graphs.  The findings for each evaluated criterion were presented as text first, with a graph to unify the data.  The graphs were free of superfluous data.  They were well sized (they did not take up an entire page) and communicated the findings clearly.

The data were analyzed and compartmentalized according to specific criteria.  The program evaluation took into account the differences between:

  • the two middle schools observed
  • genders
  • grade 7 and grade 8 students
  • alcohol and drug use (in both use and opinion)

 

I liked how the program was evaluated using different questioning techniques.

  • Questions on access to drugs and alcohol were asked using a Likert scale.
  • Questions on frequency of drug and alcohol use were asked using a Likert scale.
  • Students were also asked short-answer and open-ended questions about personal feelings.  I liked this technique as it allows the respondent to speak as an individual, with no ‘guiding’ questions or prompts.
  • Questions about specific knowledge from the Pilot Project (such as the addictiveness of cigarettes or the legality of certain drugs) were asked using True/False.  I liked how they also had a third option of “I don’t know” to allow students to be honest about their knowledge.  This is something that I will take with me when I conduct my own program evaluation.

Here is what I did not like about the report:

I found that the report did not address ethnic diversity, the cultural backgrounds, or the socio-economic backgrounds of the students.  This could have been addressed in the initial introduction, where a ‘snapshot’ of the two schools involved in the survey could have been given.  I was left asking questions such as, “Have more schools taken part in this pilot project?”, “Why were these two schools chosen?”, “Is there an existing drug and alcohol problem that this pilot project was designed to address?”, and “Is there a senior high follow-up to this pilot project?”.

Although I liked the use of the Likert scale, I was left wondering if questions regarding attitudes (and changing attitudes) would have been better addressed in a short-answer format.  This led me to another question: was there room in the evaluation to complete some pre- and post-interviews with students?  This may not have been possible because of the sensitive nature of the subject matter and the possibility of inauthentic responses in an in-person interview at this age.

J.T.

Here is a link to the program evaluation:

http://bit.ly/xiEomK