Assignment 3 – Program Planning Worksheet

Program to be Evaluated: The Diverse Learning Teacher Program

Engage Stakeholders

Who should be involved?

Primary Stakeholders

  • School Administration team (Principal, Vice Principal, and Assistant Principal)
  • The Diverse Learning Teachers (DLT)
  • Classroom teachers
  • Teacher Assistants
  • Students directly affected by the DLT program (those on Individualized Program Plans, SAP’s, etc.)
  • Parents directly affected by the DLT program (those who have day-to-day interactions with DLT’s)

Secondary Stakeholders

  • Non-classroom teaching staff (option teachers, counselor, Career and Technology Studies teachers)
  • Students not directly affected by the DLT program
  • Parents not directly affected by the DLT program
  • DLT Consultants (Central Office)
  • DLT’s at other schools

How might they be engaged?

The primary and secondary stakeholders will be engaged through a series of surveys, interviews, and consultations.  The data from these assessment tools will be aggregated into a final recommendation report.  Some of the primary stakeholders will also participate in the creation of the surveys and in data collection.  Teaching staff will be asked for anecdotal feedback on the DLT program, and students will provide feedback on their experiences in it.  DLT Consultants may be asked to provide information on the District’s vision of how DLT’s are to operate at their schools and work-sites.  DLT’s operating at other work-sites will provide information on how the DLT program works at their schools.

Focus the Evaluation

What are you going to evaluate?  Describe the program (logic model).

The program being evaluated is the Diverse Learning Teacher program that was introduced to our school in September 2011.  The Calgary Catholic School District recently decided to adopt a path that includes Universal Design for Learning (UDL) and the Alberta Education document “Making a Difference: Meeting Diverse Learning Needs with Differentiated Instruction”.  The DLT program replaced the former model of the Resource Teacher, the English as a Second Language teacher, and any other ‘resource’ teacher with a single, unified position: the Diverse Learning Teacher.  Replacing the former model with the new one required individual schools to restructure the way support services were delivered to teachers and students.  Each school chose to model the DLT program differently, keeping its unique situation in mind.  In our school, the changes were far-reaching.  There was a change in physical space (where the DLT’s were housed), time-table changes, and, most importantly, a change in “the way we do business”.  The DLT’s were given extensive professional development at the beginning of the school year.  One DLT is the DLT Coordinating Teacher (DLT-CT) and is required to attend a matrix meeting once in each 6-day cycle.  Information gathered at these meetings is shared amongst the other DLT’s at the school.

One far-reaching change from the previous year is that the barriers between Resource Teachers and English as a Second Language (ESL) teachers have been removed.  The DLT’s are now in a position to support ALL students, regardless of their needs.  These staff members also support the classroom teacher to deliver quality programs that adopt the philosophy of Universal Design for Learning (UDL).  The end-goal is to meet the unique requirements of every student – not just those who have been identified as ‘ESL’ or coded.

What is the purpose of the evaluation?

The purpose of this evaluation is to gather feedback on current practice – the ‘what is going on right now’ at the school.  The evaluation will gather qualitative and quantitative data and make recommendations on what is working best, what is not working, and what is being done that can be done better.

Who will use the evaluation?  How will they use it?

The evaluation will be used by the Administration Team and the DLT team at the school to make revisions and changes to the program for the 2012-2013 school year.  It will be used to improve DLT service provided to students, classroom teachers, and parents.

1. Administration – Administration will use the report to make staffing and timetabling decisions.  They will also use the information to make decisions regarding office space and student/teacher workspace, and will have input as to how the overall program is delivered at our local school.

2. DLT – DLT’s will use the recommendations to keep or change current practices.  This information will help in their day-to-day interactions with teachers, in setting up Teacher Assistant schedules, in student and classroom timetabling, and in streamlining their work to make it more efficient.

3. The DLT Consultants and the District – This group will use the information in the report to see how things are working at our school – to gather a snapshot of information.  The results in the report may be shared with other schools via DLT consultants to make improvements and enhancements both at the local and district level.

What questions will the evaluation seek to answer?

1. Is the current set-up and DLT program working?

2. Are the DLT’s properly scheduled?

3. Do the DLT’s feel they have the proper resources to do their jobs as expected?

4. What improvements, if any, can be made to the existing model of DLT?

5. Do students feel supported in their learning?

6. Do parents feel their children (especially those with special needs or ESL students) are supported by the school?

7.  What recommendations can be made for next year?

What information do you need to answer the questions?

  • District
    •  Initiatives mandated from the District
    • Directives coming down from central office
    •  Best-practices and current research in UDL
    • What other DLT models are operating at schools throughout the district
    • Staffing (the number of DLT’s assigned to our school)
  • School Administration
    • Scheduling/Time-tabling
    • Space allocations
    • Support staffing allocations
    • Their vision and understanding of the DLT program
    • Supports in place for the classroom teacher and professional development in UDL
  • DLT
    • Current time-tables
    • A list of current duties
    • Minutes from meetings
    • Schedules
    • Anecdotal notes
    • Student records (SRT’s, IPP’s, SAP’s)
  • Teachers
    • Support schedules
    • Anecdotal notes
    • Teacher plans
    • An assessment of technology skills
  • Parents
    • Informal discussions
    • Feedback forms (parent forms from student program plans)
  • Students
    • Informal discussion
    • Student records
    • Anecdotal notes from teachers and DLT’s

When is the evaluation needed?

The evaluation needs to be completed (including a recommendation report) by the end of April.  Administration and DLT’s will need the information so that proper planning, staffing, and accommodations can be made for the 2012-2013 school year.

What evaluation design will you use?

The evaluation will be formative.  This is the first year of the DLT program and it is scheduled to continue for the foreseeable future.  As this is the rookie year for the program, a proper evaluation will help guide the DLT program for the 2012-2013 school year.

The model that I will use is Stufflebeam’s CIPP model.  Because the DLT program is ongoing and continues into the next school year, the program needs to be evaluated and recommendations need to be made.  The following questions will be used to guide the evaluation.

“What needs to be done?”

“How should it be done?”

“Is it being done?”

“Did it succeed?”

Collect the Information

What sources of information will you use?

  • Existing Information
    • Previous Resource/ESL models
    • Current DLT models at other schools
    • Resources put out by Alberta Education on UDL and Differentiated Instruction (DI).
  • People
    • DLT’s
    • Administration
    • Teachers
    • Students
    • Parents
  • Pictorial records and Observations
    • Observations of DLT and student interactions
    • Sit in on meetings between DLTs and administration
    • Possibly attend a Day 6 Matrix meeting of DLT Coordinating Teachers (DLT-CTs)
    • Observations of DLT / Teacher planning.

What data collection methods will you use?

  • Surveys
  • Interviews (both informal and formal (video recorded))
  • Observations
  • Photos
  • Document Review
  • Journal/Reflection

Instrumentation: What is needed to record the information?

Surveys:

  1. Staff surveys
    1. Teachers
    2. Support staff
  2. Student survey (general student population)
  3. Parent survey (parents of students directly impacted by the DLT program).

Interviews with:

  1. Administration
  2. DLT’s
  3. Students directly impacted by the DLT program

Observations/Photos of:

  1. Student performance
  2. Classroom operations
  3. Teachers teaching
  4. Teacher/DLT interactions
  5. Student/DLT interactions

Document Review:

  1. Individualized Program Plans (IPPs)
  2. English as a Second Language folders (SAP’s)
  3. Student Resource Team meeting notes (SRT’s) and SRT forms
  4. District job descriptions for DLTs and DLT-CT’s.

Journal Entries from:

  1. DLT entries
  2. Student entries
  3. Teacher entries
  4. Administration entries

When will you collect data for each method you’ve chosen?

  • Surveys
    • Start: Mid-March  End: End of March
  • Interviews
    • Mid-March
  • Observations
    • On-going  – Beginning of March to end of March
  • Document Review
    • Beginning of March to Mid-March
  • Journal Entries/Reflections
    • End of March – Beginning of April

Will a sample be used?

A sample will not be used.  The program is systemic and district-wide, and it would be impossible to find a control group.  The assessment tools will be constructed to evaluate the entire population they are suited for (e.g., the student survey will be made available to all students in the school).

Pilot testing: when, where, how?

There will be no pilot testing.  All assessments will take place in the general population.  The assessment tools will be designed specifically for our DLT program; however, certain items (such as survey or interview questions) can be dropped so that the tools can be used by other schools to evaluate their DLT programs.

Analyze and Interpret

How will the data be analysed?

  • Surveys – Results will be tabulated.  Data will be synthesized into graphs and examined for trends.
  • Interviews – Interview responses will be collated into a report.  Trends will be monitored.
  • Observations – Notes will be taken during observations.  This data will be collated into a report and cross-referenced with the interviews.  Trends will be monitored.
  • Document review – Documents will be organized, read, and summarized.  These documents will be compared and contrasted with the interview and observation results to find commonalities and discrepancies.
  • Journal entries – The evaluator will review entries and reflections, looking for commonalities and discrepancies between the journal entries, observations, and survey data.  Data retrieved from the journal entries will help guide the evaluator in making recommendations for future planning.
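The survey tabulation step described above can be sketched in a few lines of Python. The question id and responses below are hypothetical placeholders, not the actual survey items, which are still to be written:

```python
from collections import Counter

# Hypothetical survey records: (respondent group, question id, response).
# These names are illustrative only.
responses = [
    ("teacher", "q1_feels_supported", "agree"),
    ("teacher", "q1_feels_supported", "strongly agree"),
    ("student", "q1_feels_supported", "agree"),
    ("student", "q1_feels_supported", "disagree"),
]

def tabulate(records):
    """Count how often each (question, answer) pair occurs, ready for graphing."""
    return Counter((question, answer) for _, question, answer in records)

counts = tabulate(responses)
print(counts[("q1_feels_supported", "agree")])  # 2
```

Counts like these can then be handed to any graphing tool to look for the trends mentioned above.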

Who is responsible to analyze the data? 

Because of the sensitive and far-reaching nature of this program, only the evaluator will analyze the data and make recommendations.

How will the information be interpreted?

The information will be interpreted by the evaluator.  Other program evaluators may be asked to help review the data and the final report.


Use the Information

How will the evaluation be communicated and shared?

The information will be presented in two phases.  Phase 1 will present the information to the DLTs and Administration.  The report will be presented in a private room with these two groups in late April (when the report is complete).  The presentation will display data from the surveys.  Results from individual responses will be presented as quotations.

Phase 2 will involve presenting the results (data and anecdotal notes) to the general teaching staff.  This will be done at a staff meeting (end-April to mid-May).


Manage the Evaluation

Human Subject Protection

All data collection is being completed by district staff, and the evaluation is being conducted for district staff.  Since all information is being kept internal, no release forms will be necessary.  Should there be a need to release the information beyond the confines of the Calgary Catholic School District, release forms will be completed.

Timeline

  • February – Complete the Planning Program Evaluation Worksheet
  • Late-February – Develop Surveys and Interview questions
  • Early-March – Administer surveys, conduct interviews
  • Late-March – Conclude and tabulate surveys, synthesize interviews
  • Late-March – Early-April – Complete report
  • Early-April – Share evaluation.


Data collection, analysis, and the final report are the responsibility of the evaluator.  Student surveys may be administered by homeroom teachers, and staff and parent surveys can be administered by school administration.


As this is a school based evaluation being completed by school personnel, no budget is being provided for the evaluation.




This program evaluation can have far-reaching impact.  The DLT program is new in our district this year, and seeing “What we’re doing right” and “What we can do better” can provide strong guidance to the staff at my school and have a meaningful impact on our students’ academics and well-being.


This is a large undertaking and may require more than one evaluator.  As further materials are developed (surveys and interview questions), it may become evident that the evaluation needs to take a narrower focus.


All materials being developed, evaluation, and results will remain property of the Calgary Catholic School District.


Assessment materials will be looked over not only by the evaluator but also by a fellow program evaluator for the “sober second thought”.  These reviews will be completed before the assessment materials are administered to staff.  This will help minimize any evaluator bias.


How Smug is your Mug??

My last blog post for ETAD 874 addressed the generation gap and technology.  I decided to take a different twist with this post and give an in-depth review of one of the online gallery tools my team is looking at to present to our client – The Saskatchewan History and Folklore Society.

From an initial consultation and needs assessment here is what the client would like:

  • The online publishing of the over 9300 Baker Slides.
  • To have these images organized and tagged (and if possible geotagged).
  • To have these images available for purchase.
  • To protect the integrity and authenticity of the images (possible watermarking of viewable images).
  • An interface that is easy to use and maintain.
  • An interface that is easy to access by the public.
  • An interface that is accessible or integrated with the overall web redesign that is taking place.

Our team came up with 3 possible ideas for the Baker Slides:

  1. Using FileMaker and constructing our own database of the slides.
  2. Using Flickr.
  3. Using SmugMug.

I had never heard of SmugMug until it was suggested by a fellow team member.  Another team member and I took on the task of developing a SmugMug prototype.  Developing the prototype required some investigation into the workings of SmugMug.  Here are some of the findings.

SmugMug offers three levels for customers.  This blog entry will be reviewing the “Pro” version.  I will explain costing at the end.

Account set-up and Registration

Set-up and registration was as easy as setting up an e-mail account.  The SmugMug main page offers multiple opportunities to sign up for the free 14-day trial, which includes the Pro features.  The initial sign-up asks only for an e-mail address and a password; no other personal information is required at this stage.  During setup, you are asked to set up a domain within SmugMug (it can be anyword.smugmug.com).  Users also have the ability to go into advanced settings and map their existing web address to their SmugMug site.  Set-up is quick and easy, and you are ready to upload your images.

Images and Galleries

After registration you create an initial gallery.  SmugMug allows users to have multiple galleries (for ‘big picture’ categorization and sorting of images) and to assign individual photos to more than one gallery.  Gallery pages are organized with thumbnail images on the left-hand side of the window (with right and left arrows to see more photos) and a larger image preview on the right-hand side.  Clicking a thumbnail changes the image preview.  Hovering the mouse over the preview image brings up image options.  Users can click ‘thumbs up’ or ‘thumbs down’, download the image, and get more information about the image (size and data on the image).  Beside the preview are options to make purchases.

Uploading images is quick and easy, as users can select multiple files from a folder on their computer and batch upload them into SmugMug.  Users can then click the thumbnails to see a preview of the photo, and from the preview users can geotag the photo, tag it with keywords (which makes it searchable in a one-stop search bar), and add a caption.

On the public side of SmugMug, viewers of images have the opportunity to share the images via Twitter and Facebook.  They also have the opportunity to comment on an image or on an entire gallery.  All comments can be moderated by the managing account.

E-commerce

SmugMug Pro offers an e-commerce solution for the professional photographer who would like to sell their images.  Products offered range from common 4×6 photographs (with sizes increasing to the limit of the resolution of the uploaded file) to mugs, posters, calendars, framed photos, wallet-size prints, and photo stickers.  All items are given a base price (the price that SmugMug charges).  Users can then apply a mark-up percentage to the entire catalogue of items, or can take the time to set mark-up prices on each category or individual item available for purchase.  All items in the catalogue are produced on demand, so there are no overhead costs for inventory production.  Item check-out is similar to checking out at any e-commerce site.  Customer service is handled by SmugMug, so users do not have to deal with any customer issues.

One of the negative aspects of SmugMug is that it is American-based.  All transactions take place in USD.  The client needs to set up a PayPal account to receive their sales proceeds.  Canadians are charged 1.5 times the U.S. shipping rate, which is still discounted compared to other international shipping rates.  Base U.S. shipping rates are quite reasonable.

Customization and Security

SmugMug offers users the ability to switch certain features on or off depending on how they would like to handle customization and security.  SmugMug allows users to:

  • Watermark their photos.
  • Disable the ‘right-click’ Save photo option.
  • Password protect certain photos/galleries.
  • Remove SmugMug headers and replace them with their own logos.
  • Moderate comments from the public.
  • Have a website free of ads and spam.
  • Enable viewing on iPhones, iPads, and other mobile devices.
  • Embed a flash slideshow of their photos in blogs.
  • Upload an unlimited number of photos (max file size of 24 MB per photo).
  •  Have a dynamic display of the most popular photos.
  • Choose from (and change at any time) 50 different themes.
  • Have full HTML control over their website.

Maintenance and Cost

Once the images are uploaded, organized, tagged, geotagged, and captioned appropriately, there is very little maintenance that needs to be done to the site.  Users can come in and change a theme with a few clicks to refresh the look of the site without major time and work.  Users can go into the site at any time and change galleries or edit photo information.

The costs for SmugMug are as follows: Basic – $5.00/month or $40.00/year; Power – $8.00/month or $60.00/year; Pro – $20.00/month or $150.00/year.  Here is a link to a comparison chart of the different price points.
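A quick bit of arithmetic shows what annual billing saves over monthly billing at each tier (a sketch using the prices quoted in this post, which may of course change):

```python
# Plan prices as quoted above, in USD.
plans = {
    "Basic": {"monthly": 5.00, "yearly": 40.00},
    "Power": {"monthly": 8.00, "yearly": 60.00},
    "Pro": {"monthly": 20.00, "yearly": 150.00},
}

for name, price in plans.items():
    # Paying month-by-month for a year vs. one annual payment.
    saving = price["monthly"] * 12 - price["yearly"]
    print(f"{name}: pay yearly and save ${saving:.2f}")
```

So annual billing saves $20, $36, and $90 per year on Basic, Power, and Pro respectively.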

Recommendation

Taking into consideration information provided by my fellow group members on the FileMaker and Flickr prototypes, I feel that SmugMug is the most appropriate choice for our client.  The features in SmugMug check off many of their ‘must-haves’ and most of their ‘would like to’s’.  The Pro version may be more than a semi-professional or hobby photographer needs, but the Baker Slides are a priceless historical collection of esteemed photos, and the money and time spent on developing proper galleries, over which the client keeps control and security, is worthwhile.  The e-commerce options also allow the selling of digital images as well as print photos (and the other merchandise mentioned above).  Since the e-commerce is built into SmugMug and handled by their team, SHFS should be able to sit back and let the pictures sell themselves.  It could be as simple as waiting for the ‘cheque in the mail’.  As the SHFS is a not-for-profit, hiring or contracting work after the instructional design team has finished its work may not be in their budget.  SmugMug allows for an easy hand-over of the account settings and will require very little maintenance and upkeep.


Stufflebeam’s CIPP Model

The model that I would choose to evaluate the Prenatal Exercise Program for Aboriginal Women is Stufflebeam’s CIPP (Context, Input, Process, Product) model.  There are several reasons for choosing this method.

Context – “What needs to be done?”

The context of this study is the pre- and post-natal health of Aboriginal women in Saskatoon.  The study was funded by the National Health Research and Development Program and is not-for-profit.  The goal of the program is two-fold: to educate Aboriginal women about Gestational Diabetes Mellitus (GDM) and to improve the physical health of these women.  The authors of the program thought this to be a worthwhile program, as the Aboriginal population is growing across Canada and many Canadian cities would benefit from similar programming.

Stufflebeam’s CIPP model of evaluation lends itself well to non-profit organizations, community development, community-based youth programs, and community foundations (Stufflebeam, 2007).  This program fits the above criteria quite well.

The main question behind context is “What needs to be done?”.  The answer is the education and well-being of Aboriginal women.  Having a clearly defined outcome is one reason for choosing Stufflebeam’s model.

Input – “How should it be done?”

“How should it be done?”  This is the question to be asked when evaluating the ‘Input’ of the project.  This program has clearly defined steps and stages.

  • The program was conducted in Saskatchewan’s largest urban centre.
  • The program was extended beyond the existing Aboriginal GDM population to all pregnant Aboriginal women.
  • The program reached 7% of the eligible population.
  • The program was offered free of charge.

Process – “Is it being done?”

The Klomp, Dyck, & Sheppard article provides a potential evaluator with strong, clear evidence of the path of the project.  They provide data on:

  • Cost for the user – The program is free of charge.
  • Date and time of the program – The program is on Wednesday afternoons.
  • Evidence that their program ideas are rooted in research – The program is based on guidelines from the American College of Obstetricians and Gynecologists.
  • Specific descriptions and data on a typical session (warm-up, aerobic activity, cool-down) are provided.
  • Data and information is provided on the personalization of the program – self-monitoring and self-pacing.
  • The look and cadence of a typical class is explained – types and variety of exercises available (walking, water aerobics, select machines, etc.).
  • Extra services provided to the client are described – Pre-natal consultations, pamphlets, access to other healthcare professionals.

Product – “Did it succeed?”

Although the article does not offer a concise summary, there is evidence throughout the article which would permit the evaluator to answer this question and evaluate the “end-product” of this program.

Here is some of the information I found that helps to answer the question “Did it succeed?”

  • The authors cite that, due to popular demand, water aerobics classes were moved to every other week.  This is evidence of strong demand for the program.
  • Participants in the program started to bring a friend (for drop in and moral support).
  • The program allowed for a flexible drop-in format to meet the needs of its participants.
  • Women were encouraged to attend after child-birth.
  • Participants were telephoned a day before each session as a reminder.

These four questions, along with Stufflebeam’s CIPP Evaluation Model Checklist, are intended to evaluate the longevity of a program.  This is a program that is intended to have long-lasting effects on its participants and the broader community.  The GDM health initiatives go beyond pre- and post-natal health to the general health and well-being of Aboriginal women.  As stated earlier in this piece, the Aboriginal population is growing.  Since this was a two-year project (1995-1997), the data produced from an evaluation would be summative.  This end-data would be useful for future programs in Saskatoon and for other Canadian cities wishing to develop similar programs.

Although I found Stufflebeam’s CIPP model the best fit for a program evaluation, there are some aspects of this program which may not be best suited for this model.  Information regarding the cost and sustainability of funding was not addressed in this report, which is one of the areas where a CIPP model will focus its evaluation.  Also, the CIPP model questions the needs of the end-user and whether those needs have been met.  The report includes some facts and observations, but no anecdotal evidence (such as interviews with clients).  Stufflebeam’s evaluation method also looks into the “lessons learned” from the program.  Although strong data was provided on the process of the program, there was very little concluding data or evidence.  This would make a proper CIPP evaluation difficult to complete.

The main goal of a CIPP evaluation model is “not to prove, but to improve” (Stufflebeam, 2007).  An evaluation needs to consider a program’s merit, its worth, its probity, and lessons learned.  Compared to other evaluation methods that I investigated, such as Stake’s Countenance model and Rippey’s Transactional model, Stufflebeam’s CIPP model is the most appropriate based on the data provided in this report.  The report showed the merit of the program (by providing data and a vision of how the project wanted to impact the lives of Aboriginal women), the worth of the program (by providing evidence that the women involved were able to make small improvements and better choices in their lives, and that these women gained access to health care professionals and resource supports), the probity of the program (by showing that what the program was doing was rooted in best practice and research), and that there were lessons learned (based on evidence provided in the body of the report, even though a complete conclusive summary was not provided).

Klomp, J., Dyck, R., & Sheppard, S. (2003). Description and evaluation of a prenatal exercise program for urban Aboriginal women. Canadian Journal of Diabetes, 27, 231-238.

Stufflebeam, D. (2007). CIPP Evaluation Model Checklist (2nd ed.): A tool for applying the CIPP model to assess long-term enterprises. Retrieved January 17, 2012, from:

The Generation Gap


One of the assignments for ETAD 874 (Advanced Instructional Design) is to find and review articles that relate to the overall design project for the course.  Let me give you some background on the design project.  The project that the 12 of us are working on is a redesign and re-organization of the Saskatchewan History and Folklore Society.  The class is divided into two teams.  One team is working on the redesign of their website, whilst the other group is working on the organization and distribution of the Baker Slides (a historical image collection).  I am in the second group.

During our first project team meeting, we discussed the initial needs assessment of the client and who their target audience is.  From the information provided to us, we discovered that it is mostly seniors who access the website and image database.  This discovery put us at a crossroads: do we revamp the website for the ‘next generation’, do we refresh what is already in place, or is there a way to accomplish both – the win-win situation?

This discussion led me to an article on instructional design and the generation gap.  Although the article is directed towards higher education, there are strong ties to the website redesign and the impact it has on users of different ages.

The article was taken from the Canadian Journal of Learning and Technology and is titled “Digital Learners in Higher Education: Generation is Not the Issue”.  It was written by Mark Bullen – British Columbia Institute of Technology, Tannis Morgan – Justice Institute of British Columbia, and Adnan Qayyum – University of Ottawa.

The article is a reaction to the popular belief amongst many designers that instructional design and technology use is generation-dependent.  The authors make a bold statement in the abstract: “A comprehensive review of the research and popular literature on the topic and an empirical study at one postsecondary institution in Canada suggest there are no meaningful generational differences in how learners say they use ICTs (information and communication technologies) or their perceived behavioural characteristics” (Bullen, Morgan, & Qayyum, 2011).

The authors spend the first few pages of their article arguing against the common belief that the net generation (which they define as people born in 1982 or later) is better suited to ICT’s in their instruction.  The authors argue that ICT use is not generation-dependent – it is use-dependent.  I will discuss this issue more later in this post.  Bullen, Morgan, and Qayyum found many studies that support this widely held belief; however, they had some issues with the research.  In many cases, research was completed on students (of multiple ages) who were already enrolled in technology programs.  They also found a study (Oblinger & Oblinger, 2005) which legitimized the unique learning patterns of different generations, but the report was based mostly on speculation and anecdotal research (p. 5).

Bullen, Morgan, and Qayyum decided to author their own study, which looked at two groups: the net generation (born 1982 and later) and the non-net generation (born prior to 1982).  The study was conducted at two British Columbia postsecondary institutions and took place in two parts – interviews and an empirical data survey.  The study aimed to answer these three questions:

“How accurate are some of the more prevalent claims about net generation learners? Do the students at this Canadian postsecondary institution fit the typical profile of the net generation learner? How are the learners at this institution using various information and communication technologies (ICT)?” (p. 6)

Here is a summary of their findings:

1)      Contrary to popular belief that e-mail is for the ‘older generation’, the study found that both groups use e-mail because of its formality and to maintain a certain distance from their professor.  E-mail was also a useful tool when needing to communicate to a group, long messages, and to share files.

2)  When asked what factors could improve their learning, both sets of students passed over ICTs and cited physical factors such as better lighting, better lab and library hours, more windows, and better internet access.

3)  There were no significant differences between the generations in computer use, their desire to explore learning, their preference for clear instructions before trying something new, or goal setting.

4)  There were no significant pattern differences in personal or institutional e-mail use.  For both groups, the most preferred method of communication with peers was in person.  In fact, both groups had similar rates of in-person vs. ICT communication with peers.

5)  The study found that the net generation was more inclined to use web tools (instant messaging, Facebook, etc.) to communicate with peers, but this was not the case when communicating with the instructor in a course.

The authors argue that the use of ICTs is not driven by generation, but by the context of the course materials.  My interpretation of their argument is that the use of web tools for learning is driven by context and the need for the technology, not by perceived age needs (which reminds me of a previous blog post I did on Technology for Technology’s Sake).  They say, “…we need to avoid the temptation to base our decisions on generational stereotypes and instead seek a deeper understanding of how students are using technology and what role it plays in learning and teaching in higher education” (p. 17).

Bullen, Morgan & Qayyum feel that the report has two main findings.  Instructional design decisions should not be based on generation and age alone, and when institutions fund future ICT investments, they should avoid blanket programming (such as system-wide web tool licences) and instead look at the specific needs of individual programs and fund them appropriately.  They say that institutions should avoid making campus-wide ICT decisions, as these may not be appropriate for all programs (p. 18).

Although this article dealt with higher education, the idea of generation has been discussed in our round-table group meetings more than once and has become a “hot button” issue for us.  It is something that as a group we will need to discuss further before proceeding with our design.  I hope that this article review sheds some light on how different generations interact with technology.  One of my guiding design principles is that good design will transcend the target audience.  I plan on bringing this idea forward in our design team meetings.



Bullen, M., Morgan, T., & Qayyum, A. (2011). Digital learners in higher education: Generation is not the issue. Canadian Journal of Learning and Technology, Spring 2011. Retrieved from:



Program Evaluation Review

Here we are in 2012 and I have a new batch of courses that I’m taking.  I just finished Instructional Design and Designing for Distance Education.  My next two courses are Program Evaluation and Advanced Instructional Design.  Completion of these two courses will put me over the halfway mark in my Master’s program.  That’s hard to believe!  This first blog entry (in a LONG time) is a program evaluation review.  Over the coming months, I’ll be posting assignments here on my blog.  I encourage comments!


Review of “The Brandon Middle Years Pilot Project Program Evaluation” (BMYPPPE)

The BMYPPPE was completed by David Patton, Ph.D., and Jackie Lemaire, B.Sc., in August 2002.  It is an evaluation of a pilot program taking place in two Brandon junior high schools whose goal was to increase Grade 7 and 8 students’ knowledge of alcohol and drug use, and to see whether increased knowledge would change students’ decision-making skills.  I found the report after a Google search of “middle years” and “program evaluation”.  Since middle years are my area of interest, I wanted to find a report that dealt with that target group.

What I liked about the Report:

The report was well organized and properly formatted.  It was official and professional.  It followed the general program evaluation format of the title page, table of contents, introduction, research results, summary, and appendices.  There were no grammatical errors and the findings were reported at an understandable language level.

Findings were presented in graphs.  The findings for each evaluated criterion were presented as text first, with a graph to unify the data.  The graphs were clear of any superfluous data.  They were well sized (they did not take up an entire page) and communicated the findings clearly.

The data was analyzed and compartmentalized according to specific criteria.  The program evaluation took into account the differences in:

  •  The two middle schools observed
  •  Gender differences
  •  Differences between Grade 7 and Grade 8 students
  •  Differences (in use and opinion) between alcohol and drug use


I liked how the program was evaluated using different question techniques.

  •  Questions on access to drugs and alcohol were asked using a Likert scale.
  •  Questions on frequency of drug and alcohol use were asked using a Likert scale.
  •  Students were also asked short-answer and open-ended questions when it came to personal feelings.  I like this technique as it allows the respondent to speak as an individual and there are no ‘guiding’ questions or prompts.
  •  Questions regarding specific knowledge from the Pilot Project (such as the addictiveness of cigarettes, or the legality of certain drugs) were asked as True/False.  I liked how they also had a third option of “I don’t know” to allow students to be honest about their knowledge.  This is something that I will take with me when I conduct my own program evaluation.
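These mixed formats are worth capturing concretely. As a rough sketch (my own illustration in Python, not anything taken from the report, with invented question text), the three question types could be represented like this when building a similar instrument:

```python
# Hypothetical items illustrating the three question formats described above:
# Likert scale, True/False with "I don't know", and open-ended.
LIKERT = ["Never", "Rarely", "Sometimes", "Often", "Very often"]

survey = [
    {"type": "likert",
     "text": "How often do you see alcohol at parties you attend?",
     "options": LIKERT},
    {"type": "true_false",
     "text": "Cigarettes are addictive.",
     # The third option lets students be honest about not knowing.
     "options": ["True", "False", "I don't know"]},
    {"type": "open",
     "text": "How do you feel about drug use among people your age?"},
]

def valid_answer(item, answer):
    """Open questions accept any text; closed questions must match an option."""
    return item["type"] == "open" or answer in item["options"]
```

Keeping “I don’t know” as a first-class option, rather than treating it as a blank, is what makes the True/False data honest.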

Here is what I did not like about the report:

I found that the report did not address ethnic diversity – the cultural or socio-economic backgrounds of the students.  This could have been addressed in the initial introduction, where a ‘snapshot’ of the two schools involved in the survey could have been given.  I was left asking questions such as, “Have more schools taken part in this pilot project?”, “Why were these two schools chosen?”, “Is there an existing drug and alcohol problem which this pilot project was designed to address?”, and “Is there a senior high follow-up to this pilot project?”.

Although I like the use of the Likert scale, I was left wondering if questions regarding attitudes (and changing attitudes) would have been better addressed in a short-answer format.  This led me to another question: was there room in the evaluation to have completed some pre- and post-interviews with students?  This may not have been possible because of the sensitive nature of the subject matter and the possibility of inauthentic responses in an in-person interview at this age.


Here is a link to the Program Evaluation

A whole new way of Social Networking

(My lineage)

So I’ve done something that has taken social networking to the next level.

How would you feel about being ‘friended’ by someone you have lots in common with, but is a complete stranger? You’ve never met this person, but you are connected to them.

I’m talking about social networking based on your genes and DNA.

If you follow me on Twitter, you can probably recall that I sent off a sample of my saliva to have my DNA analyzed.  I sent it to a company called 23 and Me.  They were first featured on Oprah a few years back and more recently on Anderson (Anderson Cooper’s new daytime talk show).  Their product is rooted in science.  Here’s how it works.

You sign up for an account and pay them money.

They send you a kit.

You spit into the kit, shake it, seal it, and send it off to their lab in California.

4-6 weeks later you get an analysis of your genome.

The results are completely on-line.  You receive an e-mail saying that your results are ready and to log in to find out.  The findings are grouped into two distinct areas, each with its own sub-groups.

1. Health (Disease Risks, Drug Interactions, Health Labs, to name a few).

2. Ancestry (maternal and paternal lineage, and global ancestry).

I knew that I would find out this information (as I had done my due diligence and research before signing up), but what happened next was a surprise to me.  When I logged in (one day after I had received the e-mail) I already had a request in my account to make a new connection… with a 5th cousin!  (It turns out that there are 247 other people on 23 and Me to whom I am related (mostly 3rd to 6th cousins).)  This brings social networking to a whole new level.

I’ve never met this person… nor have I ever heard of this person.  Is there more of an obligation to accept this friend request and start sharing my DNA information with him?  They always say, “You can’t pick your family!”  It turns out that we’re 5th cousins… which means we share great-great-great-great-grandparents.  Whoa!  I’ve probably passed people in the mall that I’m more closely related to.  What about privacy?  Just because we’re ‘family’, do I open up and share everything?

How long will it be before either 23 and Me expands their social networking to be more like Facebook, or Facebook decides that 23 and Me would make a good purchase?  Social networking has already made it easy for us to share the details of our lives.  It has also made us comfortable with it.  I know that personally I’ve become more ‘comfortable’ and feel ‘safer’ on the internet.  How long will it be until even more of us is shared with friends… family… strangers?


Technology for the Sake of Technology


I’ve been back at school for about two weeks now.  It’s been great to get back in the building.  It’s where I belong and it’s what I love to do.  Writing blog entries and starting my master’s degree in Educational Technology and Design have really opened my mind to thinking about things in different ways and from different angles.  It’s meant a whole new way of approaching and solving problems.

One idea that has stuck with me from my summer course was “What exactly is technology and how do we use it?”  Let me clarify.  One chat that happened over Twitter and in our course discussion thread was that hair colour is a technology.  If we look at technology as being cables, cords, wires and processors, then hair colour does not fit.  If we use a broader definition – that technology is a solution to a problem – then hair colour is technology.

So that brings me to this morning, when a colleague asked me to look through a PowerPoint presentation that he was working on to present to his class.  The topic was the legislative and judicial branches of the Canadian government (Social Studies 9).  His classroom is very well outfitted.  He has a tablet laptop, projector, sound system and a mounted SMART Board.  He began by apologizing to me for not using SMART Notebook software.  The apology came because in my school I have been working with staff on improving their SMART Board skills.  I told him that he didn’t need to apologize.  From my reading/thinking/processing over the summer, one of the truths that I’ve come to believe is that technology is most useful when it can meet a need.  But it’s not only meeting a need that’s important – it’s meeting a need in an efficient and useful manner.

This teacher has interactivity in the classroom.  His students are engaged in conversation, group work, group research, and collaborative learning.  PowerPoint has gotten a bad rap for being a ‘boring’ means of presentation.  There are many other tools one could use (SMART Notebook and Prezi come to mind).  However, is it the teacher or the technology that is going to drive this lesson?  His slides had humour, great infographics, and fantastic images.  There was very little text, which means that it was not PowerPoint that was going to drive the lesson, but a real-life teacher.

I’ve come to a conclusion that using technology for the sake of using technology is not always best practice.  Don’t we have times where reaching for a pad and pen is quicker than opening an application on a laptop to make a reminder or jot a phone number?  Does fancier/flashier software make for a better lesson or teacher?  Aren’t there times when using a whiteboard marker to solve an equation is easier than starting up a computer and using an interactive white board?  What about phoning parents vs e-mail?  Are there times when a phone conversation is more useful than getting into a misunderstood e-mail battle?

This blog entry may seem ‘anti-technology’, but that is far from the truth.  I love technology – I use it all the time in teaching and in my personal life.  I love exploring new web tools and applications and seeing how they can be used to improve the status quo.  My point is this: technology for technology’s sake is not always best practice.  It’s important that we choose the right tool for the right job and make sure that we use our ‘human’ talents and efficiencies in harmony with technology.


Removing the Barriers

In June 2010, Alberta Education released the document Action on Inclusion.  This was a metamorphosis of ‘Setting the Direction’, which took a look at special needs education in the province.  What the department found was that good teaching practice is good teaching practice.  If there is a modification, technology, or assistance that can be given to one student, can that not be best practice for all students?

The document identifies that some students will need long term support, but also that some students may only need short term support.  To help us through understanding Action on Inclusion, Kathy Howery from the University of Alberta came and gave a presentation to both the Elementary and Junior High AISI cohorts.   She is an inspiring person and is passionate about Universal Design for Learning (UDL).

Kathy said that many of the technologies we use today were born out of assistive technologies.  Think about predictive text on our mobile devices: initially created as an assistive technology, it has become commonplace.  What about sidewalk ramps?  They were initially created as access for wheelchairs but have gone much further – people with baby carriages, wagons, small children on bikes, etc., use them.  Think about closed captioning on television.  Initially created for the hearing impaired, closed captioning is now used in gyms and has probably saved many marriages with televisions in the bedroom.  Spell check is again something that was created for the margins but has been adopted by the mainstream.  She also makes the argument that some of these tools have become so commonplace that to remove them would seem unnatural, like removing someone’s eyeglasses.

UDL is based on research and rests on three principles.

1)   The ‘What’ of Learning – Present information in different ways – Multiple means of representation

2)   The ‘How’ of Learning – Differentiate the ways students express what they know

3)   The ‘Why’ of Learning – Provide multiple means of engagement.

In her presentation Kathy used an example of a novel study for high school English.  I believe the novel was Huckleberry Finn.

So we have the traditional model: Teacher assigns novel –> Students read novel –> Students synthesize novel (formal assessment).

UDL would take a different spin on this.  Depending on the individual’s need, ‘reading’ the novel could mean listening to an audiobook on an iPod, using a ‘Coles Notes’ version, watching the movie, reading a synopsis on Wikipedia, or using 60secondrecap, just to name a few.  Then students may choose a different way to show their work: they could do a VoiceThread, make a video, a dramatization, a photostory, a concept map, a dance, a traditional report or even a blog!

What UDL recognizes is that we as teachers need to take a proactive and inclusive approach to the diverse student population that is before us.  We cannot continue to teach to that elusive ‘middle ground’ one-size-fits all student, because they do not exist.  We need to create pathways and remove barriers to student learning.  Students need choice, options and flexibility in their schooling.

A real obstacle facing UDL (and Action on Inclusion) is curriculum – more specifically, the current Program of Studies.  The current POS has a narrow focus and constructs barriers.  As a result of Action on Inclusion, the province of Alberta has embarked on the next phase, Action on Curriculum.  AOC seeks to completely re-think curriculum and its delivery in Alberta.  Action on Curriculum has the power to completely transform our current model and what we think about ‘school’.  Click this link and watch the short video.  I found it inspiring and forward-thinking – we may just do what’s best for kids yet!

Digital Agendas


This fall marks the beginning of the end of paper agendas for junior high students in our district.  Our current agendas are probably similar to what others have: a student handbook at the beginning covering school policies, September to June calendars for students to write in, and a reference section at the back with math formulae, a periodic table, a map of Canada, etc.

In September, a limited number of agendas will be available for purchase at the main office – they will not be handed out to every student on the first day of school.  These agendas will not include the section on school policies and procedures, which will now be published to our school website and D2L homepage.  In 2012-2013, no paper agendas will be ordered.  Click here to link to my district’s press release.

How is this going to work?  D2L has the ability to take events from different courses and populate them in a single homepage calendar for each student.  For example, if I make an event for a math project, I can put it in the course calendar, and it will be pushed down to the individual calendars of the students who are enrolled in the course.  D2L has a homepage widget called ‘Events’ which organizes what is coming up ‘Today’, ‘Tomorrow’, and ‘This week’.  This event box is front and centre when students log on to D2L.  The user’s individual calendar is a hybrid of course events pushed down from instructors and user-generated events.
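As a rough sketch of the kind of bucketing the ‘Events’ widget does (my own illustration in Python; D2L’s actual implementation is not public, and the function name, event titles, and dates here are made up):

```python
from datetime import date, timedelta

def group_events(events, today):
    """Sort (date, title) events into the Today / Tomorrow / This week
    buckets shown by the homepage widget."""
    tomorrow = today + timedelta(days=1)
    week_end = today + timedelta(days=7)
    buckets = {"Today": [], "Tomorrow": [], "This week": []}
    for when, title in sorted(events):
        if when == today:
            buckets["Today"].append(title)
        elif when == tomorrow:
            buckets["Tomorrow"].append(title)
        elif tomorrow < when <= week_end:
            buckets["This week"].append(title)
    return buckets

# One student's hybrid calendar: two course events pushed down by
# instructors plus a personal, user-generated event.
calendar = [
    (date(2012, 9, 10), "Math project due"),  # pushed from Math course
    (date(2012, 9, 11), "Science quiz"),      # pushed from Science course
    (date(2012, 9, 14), "Band practice"),     # user-generated
]
print(group_events(calendar, today=date(2012, 9, 10)))
```

The student never merges anything by hand: whatever the instructors push down and whatever the student adds personally all land in one list, and the widget simply filters it by date.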

Moving to digital agendas has some positives and some areas of concern.  First, the positives:


1)   Digital agendas are never lost.  Lost agendas have been a real problem. D2L offers a “one stop shop” for events, content, and grades.

2)   Digital agendas are easily accessible.  D2L offers a traditional computer log-in as well as a mobile portal.  Students can access their calendars on their mobile devices and personal (or school) computers.

3)   Calendar events can be directly linked to course content.  For example, if a student has a science exam on Wednesday, that event can be linked to multiple sources of content – study sheets, YouTube videos, practice questions, etc.

4)   Parents will have consistent access.  No longer will it be a scenario of “Show me your agenda”; parents will have the ability to log into their child’s account and track their child’s events.


And the areas of concern:

1)   With whom does responsibility lie?  With traditional agendas it is up to the student to record their appointments, so the majority of the responsibility lies with the student.  With digital agendas it is the instructor who will add events to the calendar.  Does this take away responsibility for student time management?  Can it become a shared responsibility?  How do we teach and model time management?

2)   Teachers must consistently use the calendar and events tools.  This relates to the above point.  Does a teacher creating the event take away from the student’s responsibility for his or her own learning?  That is a concern for my staff. Also, what if a teacher is not “pulling their weight”?  What expectations are placed on teachers to input their events?

3)   Digital agendas cannot be used as a behavioural tool.  In the past, paper agendas were used for parent communication, hall passes, bathroom monitoring, etc.  We will have to explore new ways of managing students as well as using our LMS for parent communication.

4)   Parents must be informed and guided through the transition.  School administration, my AISI cohort, and I will run a P.R. campaign to inform parents and help them transition to the digital agenda.

Have you been using digital agendas in your school?   I would love to hear your thoughts and feedback.


Gone But Not Forgotten

Three years ago, I was digging through some dusty old photo albums at my grandparents’ house.  These old albums were windows to yesteryear… seeing my dad as a boy, and seeing my grandparents as other than ‘old’.  One photo that struck me was of a beautiful mid-twenties Chinese woman.  She had medium-length thick black hair, fair features, and was dressed in a black dress with pearl earrings and necklace.  I had no idea who this was, so I asked some questions.  It turns out it was my Bak-hoo (great-grandmother), who at the time was still alive and approaching her 100th birthday.

This past February she passed away at 100 years of age.  What an incredible life!  Can you imagine what you would see in 100 years?  Death always takes us to a place of reflection and introspection.  Being two generations removed, and with a language barrier between us, I never really got to know her.  Everything I do know was passed down from my grandparents or my dad.  I knew that she had courage, spunk, and sass.  I knew that she risked everything to leave mainland China to make a better life for her family, of which I am a part.  However, I never really got to know HER.  (Here is one of the few digital items left behind by my great-grandparents (Chu).)

I caught a TED Talk by Adam Ostrow.  The topic was “After Your Final Status Update”, and his talk is what inspired me to write this post.  Living in the media age, are we going to live in a shroud of mystery?  I don’t think so.  A complete life profile will take shape by collating our digital ‘bread crumbs’.  Our lives will not be a series of facts passed down through stories, but a rich history and chronology of our time on Earth.  Our posts/tweets/updates are our voice; they are more than factual.  They include our tone, our position, and our emotions.

I can only imagine what Chu Yek-Seen would have tweeted 70 years ago. What would I be able to piece together today from her life if social media had been around?

This has significant implications for our life today and digital citizenship.

1)   What we do and say today will be around long after we’re gone.  Social media is creating digital time capsules of our lives.  Every time we create a new profile for a website, tweet, blog, or post, we are donating one more artifact to our digital archive.

2)   Digital personae are created before we arrive.  Parents are posting pictures of sonograms, names, and details of their children before they are born.  I’ve even known someone who created a Facebook page for their unborn child (they did an excellent job, by the way).

3)   Last will and testament.  Beyond our estate, each of us should think about how we want our digital persona to live on after we have gone.  In saying this, we may not have a choice in the matter.  All we can control is how we would like our social media sites to be ‘shut down’ – the who/what/where/when of who will do it.  Do we leave a final ‘blog’, ‘tweet’, or ‘status update’ from the great beyond?  Complete digital representations of us will most likely be created by software from a synthesis of our digital artifacts.  Will we speak beyond the grave?  Will our avatars live beyond us?

When we teach and model digital citizenship to our students, it is important that they understand both the positive and negative aspects of the digital content they create.  I’m not saying that we need to discuss death and mortality with students, but it is important that they realize, good or bad, that what they’re saying and doing on-line is creating an archive of their lives.  A colleague of mine has a poster up in his classroom that says, “Your grandma just saw the picture you posted on-line”.  I’d like to flip that and say, “Your grand-kids are going to see that picture you posted on-line.”