Saturday, March 3, 2012

Assignment 5


The survey was created for students in grades 6-12.  The Study Hall Program is in its third term.  Quantitative data from four terms of marks will also be collected to accompany the survey.

Eight people took part in my Focus Group: four students (from grades six, seven, ten, and twelve) and four adults (two male, two female) completed the survey and assisted me in creating a usable product.  The group worked from a paper copy, and when the final product is completed it too will be given to the students on paper, as this will be the quickest and most efficient way to administer it at the school level.

This was a great experience for me.  Having a group of people look at your work and criticize it is not always the most comfortable thing, but when you are expecting it, it is a little easier to take.  It reminded me how important it is in Instructional Design to have a product where everybody understands exactly what the author is asking.

1.  What grade are you in?

I figured that this would be easy enough, but what I didn't think of was that in our split classes the students would put down the teacher in charge.  I was looking for a more specific answer.  It was suggested that I give a selection of grades, so this question was changed so that a grade will be selected.

2. Are you male or female?

I received the most response to this question.  I had seen a survey with similar participants that used this question and included it in mine.  After receiving feedback, I came to the conclusion that this information is really not necessary for my survey.  In fact, it seemed kind of silly after I thought about it; one participant wrote, “This is silly, Mr. Wandler!”  It was eliminated from the final survey.

3. Do you find that your assignments are being completed on time with the help of Study Hall?

One of the responses sparked the idea of making this a rating scale question, which in turn gave me the idea to write several questions in that format.

4. Do you find that your assignments are being completed on time with the help of Study Hall?

The biggest criticism of this question was that it was a yes or no question.  I made this a range question as well.

5.  What are some things that prevent or hinder you from completing assignments?

The feedback suggested changing this to a range question as well, and not using "hinder" as a descriptor.  In the end, I eliminated the question.

6.  Read each sentence below and put an x where you feel your response fits.

This was fine with the focus group.  They liked that they were able to choose all that applied.  Looking at the format, I felt I could clean it up by making each item a rating scale.

7. Tell us what is good about Study Hall.  What should stay the same?

This question was appreciated, as it gave the Focus Group a way to list the things they like about Study Hall.


8. How would you change Study Hall to make it better?
Same feedback as the previous question.


The feedback I received from this survey was invaluable.  Not only did I gain a better understanding of what Study Hall meant to each person, but I was also able to create additional questions derived from their responses.  Studying the Focus Group's answers made it clearer to me which questions I wanted to use as rating scale questions.  This assignment showed me how important it is to get outside help when creating surveys such as this.

Friday, February 10, 2012

Assignment 4-Logic Model

Below is a link to my logic model for Study Hall.

http://boxify.me/logicmodel1

Assignment 3-Program Evaluation-Study Hall


Engage Stakeholders

Who should be involved?
The teachers, administrators, EAs and students at Allan Composite School will be involved in the assessment.

How might they be engaged?
Staff, administration, EAs and students will reflect on the program through survey questions, interviews, and observations.  Students will also be involved through summative evaluation, comparing previous terms' grades to current ones, and through interviews.

Focus on the Evaluation

What are you going to evaluate?  Describe program (logic model)
The plan is to evaluate:
- use of resources: time, teachers and EAs
- procedures: how the program is being implemented; are assignments being handed in
- marks: impact of Study Hall on grades

Allan Composite School serves 178 students from kindergarten to grade 12.  The school is part of the Prairie Spirit School Division.  The division created a document in 2009 that outlined the direction it was taking with regard to assessment and created a plan for implementation.  Starting in 2009, teachers have been educated and in-serviced about the division's approach to the document.
My admin partner and I created a block of time, twenty-five minutes during the school day, every day, where students would have time to complete assignments, study, etc.  We call this time Study Hall.  Students from grades six through twelve return to their homeroom teachers during this time to work on assignments.
Study Hall is a time built into the day for students to be more successful at school.  My admin partner reworked the schedule, taking five minutes from each period to create a twenty-five minute mandatory Study Hall period.  It has been strategically placed into the daily schedule to prevent students from skipping it, and attendance is taken during this time.
This is a period created by taking instructional time from all subject areas, so it is there to be used for just that: instructional time.  Teachers are expected to help guide students if they are not working on appropriate things during this time.  Students are expected to work on assignments, study, write tests and quizzes, learn how to study, etc.

What is the purpose of this evaluation?
This is a formative and summative assessment with the purpose of identifying the effectiveness of the program, identifying any weaknesses, and making adjustments to meet students' needs and make the program more effective.

Who will use the evaluation?
The designers of the program will use this evaluation to improve the program.  Teachers, EAs, and administrators will use it to reflect on the feedback gathered and work on solutions for improvement.
Who will use the evaluation? How will they use it?
- Administrators: information from the evaluation will be used to improve the program.
- Staff: results and conclusions from the program evaluation will be shared with the teachers.  The purpose is to show staff what is and isn't working in Study Hall.
- Students: staff and administrators will be able to make changes to the program if necessary.
- Parents: results from the evaluation will be shared with parents and suggestions for improvement will be invited.
- School Community Council: results from the evaluation will be shared with the SCC.  Feedback from them may result in adjustments to the program.



 What questions will the evaluation answer?


How does Study Hall impact students?
How does Study Hall impact Staff?
How do parents see the impact of Study Hall?
Are teachers guiding students during this time?
Are students’ marks improving?
Are assignments being handed in on time more often?
Are students finishing assignments?
Are resources being used to their potential?

What information do you need to answer the questions?

What I wish to know, and indicators (how will I know it?):
- Is the time allotted for Study Hall appropriate?  Indicator: students are more successful.
- Are students' marks improving?  Indicator: data from previous years.
- Are students using the time effectively?  Indicator: observations.
- What changes are necessary to make the program more effective?  Indicator: implementation of new strategies is creating successes.

 When is the evaluation needed?

Ideally the evaluation will be completed by May 1.  This will give the school enough time to implement any adjustments before the end of the year.

What evaluation design will I use?

The evaluation will be formative in nature.  Summative data will also be used, comparing term averages from before the program was scheduled with current ones.  The program will be evaluated and any modifications, tweaks or changes will be applied prior to the end of the year.  The evaluation will include a combination of the Provus-Discrepancy and Stake-Countenance models.
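As a rough illustration of how the term-average comparison might be done, here is a minimal sketch in Python. It assumes the marks have been exported to a hypothetical CSV file (term_marks.csv, with student, term, and mark columns) and that the pre-program terms are labelled as shown; none of these names come from the plan itself.

```python
import csv
from collections import defaultdict

# Hypothetical export of term marks with columns: student, term, mark.
# Terms from 2009/2010 (before Study Hall was scheduled) are treated as "before".
BEFORE_TERMS = {"2009-T1", "2009-T2", "2009-T3", "2009-T4"}


def term_averages(path):
    """Return a {term: average mark} dictionary from the exported CSV."""
    marks_by_term = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            marks_by_term[row["term"]].append(float(row["mark"]))
    return {term: sum(marks) / len(marks) for term, marks in marks_by_term.items()}


if __name__ == "__main__":
    averages = term_averages("term_marks.csv")  # hypothetical file name
    before = [avg for term, avg in averages.items() if term in BEFORE_TERMS]
    after = [avg for term, avg in averages.items() if term not in BEFORE_TERMS]
    print(f"Average mark before Study Hall: {sum(before) / len(before):.1f}")
    print(f"Average mark since Study Hall:  {sum(after) / len(after):.1f}")
```

A spreadsheet would do the same job; the point is simply that the before/after averages line up with the Provus-Discrepancy idea of comparing what was expected with what is observed.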

Collect the information

What sources of information will you use?
- Existing information: marks from 2009/2010 will be used as part of a summative evaluation
- People: administration, teachers, Educational Assistants (EAs), and students
- Pictorial records and observations: administrative observations, teacher observations and EA observations

What data collection method(s) will you use?

- Survey
- Document review
- Interview
- Observation
- Unobtrusive measures
- Term marks

Instrumentation: What is needed to record the information?
- Surveys (teachers, EAs, students, parents)
- Interview (students, teachers, EAs)
- Data collection sheet for:
  - group/individual observations
  - student focus
- Journal (includes possible reflective questions); this could be utilized by both students and teachers.  The focus would be on the type of strategy teachers are using to help students be successful.
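If the data collection sheet for observations were kept electronically, one possible layout is sketched below; the column names are my own assumptions for illustration, not part of the plan.

```python
import csv

# Assumed column layout for the group/individual observation sheet (hypothetical names).
OBSERVATION_COLUMNS = [
    "date", "grade", "observer",              # when the observation happened and who made it
    "students_present", "students_on_task",   # attendance and engagement counts
    "activity",                                # e.g. assignment, studying, test/quiz
    "teacher_strategy",                        # strategy used to help students be successful
    "notes",
]


def create_observation_sheet(path="study_hall_observations.csv"):
    """Create an empty CSV observation sheet with the columns above."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(OBSERVATION_COLUMNS)


if __name__ == "__main__":
    create_observation_sheet()
```
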
When will you collect data for each method you’ve chosen?

- Surveys: during the program
- Interviews: during the program, and later
- Data (marks): before the program, during the program, and later
- Observation: before the program, during the program, and later

Will a sample be used?
No

Analyze and Interpret

How will the data be analyzed and what methods will be used?

- Surveys: compiled and assessed for trends in particular fields
- Interviews: compiled and assessed for trends in particular fields
- Data: marks from the previous year analyzed; rubric assessed for commonalities
- Group observations: all grades observed during the Study Hall period; observations assessed and analyzed for commonalities and differences
- Journals: read by the evaluator, looking for commonalities and memorable quotes (testimonials to the program)
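To make "assessed for trends" a little more concrete, here is a minimal sketch of how the rating-scale survey responses could be tallied once they are keyed in.  The question wordings are adapted from the survey above, but the ratings shown are made-up placeholders, not real data.

```python
from collections import Counter

# Placeholder rating-scale responses (1 = strongly disagree ... 5 = strongly agree).
# Real numbers would be keyed in from the paper surveys.
responses = {
    "Assignments are completed on time with the help of Study Hall": [4, 5, 3, 4, 5, 2],
    "The time allotted for Study Hall is appropriate": [3, 3, 4, 2, 5, 4],
}

for question, ratings in responses.items():
    counts = Counter(ratings)
    print(question)
    print(f"  average rating: {sum(ratings) / len(ratings):.1f}")
    # A simple distribution makes trends (clusters at the high or low end) easy to spot.
    for rating in range(1, 6):
        print(f"  {rating}: {'#' * counts.get(rating, 0)}")
```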

How will the information be interpreted, and by whom?
The principal and evaluator will be responsible for analyzing the data.  Staff will be brought into the discussion and have time to engage in dialogue and offer their interpretations during the report debrief.

What did you learn?  What are the limitations?
The evaluator is internal, so caution needs to be exercised while interpreting the results.  As an administrator is creating the survey questions, participants will be responding to a colleague's work.

Use the information

How will the evaluation be communicated and shared?
Commonalities will be shared with the participants along with the observational data.  The data will be discussed in small groups and later brought to a large group discussion.

What are the next steps?
The large group will have an opportunity to reflect on the ideas brought to the group and then make recommendations for any adjustments desired.

Manage the evaluation

Human subjects' protection
Surveys will be done anonymously, as will the journal entries by students.  This will allow participants to answer questions truthfully.  The students' needs are what will be addressed after looking at the results.

Management chart
The evaluator will manage the program evaluation with the administrative partner prior to the staff meeting where the large group will share the results and hear about any ideas for adjustments.

Timeline
The surveys, observations, interviews and journal entries will be distributed and take place during the months of March and April.  All data will be analyzed and summarized, with recommendations made by April 20, 2012.

Responsibilities
The evaluator will create all surveys with input from staff, distribute them, analyze the results, summarize findings, and coordinate discussions with staff and the SCC alongside the principal.  Collaboratively, a list of recommendations will be compiled.  Interviews with staff, students and parents will be conducted by the evaluator, who will be responsible for analyzing the data and sharing it in the same way as previously discussed.

Budget
All devices and materials required are located internally and therefore no additional funding is necessary.

Standards

Utility
The evaluation will be used to make adjustments to the program that immediately affect how effective it is.

Feasibility
Due to the size of the school and the number of students and staff, the evaluation will not be an onerous task.  The evaluator has the authority to use substitute teacher time to meet with teachers if need be.

Propriety
As the school is small, the staff and students have a very close relationship.  This relationship is well respected, and the evaluation is designed in such a way that it will honour this atmosphere.  The staff has already been included in the preliminary planning of this program.

Accuracy
Marks from previous terms will be measured against those that are current.  Staff are experienced and knowledgeable as to the needs of the students at this school.
 

Saturday, January 21, 2012

Assignment 2-GDM Study


While reading this article, I was able to make some definite personal connections.  Close friends of mine have experienced this condition, so I understand how important it is to identify risks and educate pregnant women about methods of prevention.
There were so many things to look at in this study.  I was curious about how they got everyone to the facility; I am thinking that transportation was paid for and included in the program.  I was also intrigued by how they supplied participants with materials to take home or read there.  For this study I chose two goals to focus on:
1)   First Nations women have higher rates of gestational diabetes mellitus (GDM) than non-First Nations Women.
2)   Exercise should be used as a primary prevention strategy for type 2 diabetes.
Looking at a study where exercise is being used to help prevent the onset of type 2 diabetes will benefit other Aboriginal women not only in Saskatchewan but those across Canada as well.
Stufflebeam’s CIPP Model is an effective tool that can be used to evaluate this program.  It is a very thorough model and very easy to follow.  The graphic organizer he uses to explain his model mirrors that of the Saskatchewan Curriculum by centralizing the model around the goals.
I like that the purpose of Stufflebeam’s model is not to “prove but improve” (p. 4), though I say that with a bit of caution.  There is plenty of information provided to the evaluator for the context (C) of the program.  Researchers recorded excellent data (I) around what was occurring in the study.  While documenting this, the researchers would have information to measure the effectiveness of what the participants were doing.  The process (P) was clearly outlined in the document as far as formatively evaluating the program.  For longitudinal data, participants would have to be monitored for years after they finished the program (summative evaluation).  Stufflebeam’s model allows for a product (P) where improvements can be made during the program.
I feel the information gathered during this study would be invaluable not only to Aboriginal women but to all women of child-bearing age, regardless of race or ethnicity.  The Provus-Discrepancy Model would contribute to the delivery of this program to the public.  Health officials would be made aware of the effectiveness of the program, and using this model one could compare whether preventative care is more cost effective than treating post-childbirth issues.
Using this two-model approach, one could establish whether the program is beneficial enough to continue, or what changes to make.
           

Saturday, January 14, 2012

Assignment 1-Program Evaluation


Assignment 1-Program Evaluation

I had heard of a program called One Laptop Per Child (OLPC) through a pilot project I am part of with our school division.  I know that this program is meant for students who live in countries where technology is not easily attainable.  I do think that the goals of my program are similar in that we are trying to engage students and improve their education.  I found a program evaluation done on OLPC in Peru that was published in 2010.
Nicholas Negroponte was the founder of OLPC; his goal was to create affordable laptops to put into the hands of students in least-developed countries.  He wanted to give them the opportunity to use technology to improve their education.
The Inter-American Development Bank (IDB) published a report in 2010 evaluating a program adopted by the Peruvian Government called One Laptop Per Child (OLPC).  The IDB is “the largest source of development financing for Latin America and the Caribbean, with a strong commitment to achieve measurable results” (http://www.iadb.org/en/about-us/about-the-inter-american-development-bank,5995.html).
            I believe Scriven’s model was used in this program evaluation.  My only reservation around this is that it is not clear whether or not the authors knew beforehand the goals of the Peruvian government in relation to this project.  An assessment was developed in agreement between IDB and the Peruvian Ministry of Education (MINEDU).
The program evaluation included both formative and summative evaluation, as surveys were developed to see how the program had impacted various groups of people, including teachers, students, parents and administrators.  It also investigated how to improve the program in Peru.
            As in Scriven’s model, formative assessment was used through qualitative and quantitative data analysis. 
            The qualitative study provided information three months into the program from both schools that had and had not received laptops.  The information obtained provided the ability “to explore the impact on the attitudes, practices, and perceptions of the principals, teachers, students, parents, as well as to document the implementation process and explore the experiences, reactions and results of the distribution of computers.”(p.4)
            The quantitative study compiled data gathered from students, parents, teachers and principals.  The information was obtained through student testing and interviews, family, teacher and school principal questionnaires, as well as classroom observations. 
            In 2009 surveys were used to evaluate the program in selected schools that had been designated for the OLPC program.  As this was done in November, the majority of the schools that were being used in the evaluation process had only received their computers three months prior.  Therefore, the authors used the survey to gather data for short-term results to be used for future visits scheduled in October and November of 2010. 
Through summative evaluation, strengths and weaknesses of the program were defined.  They found students and families were scared to take the laptops home in case they caused damage.  Teachers initially felt inadequately trained, and through the evaluation additional training could be provided.
I found the evaluation to be very thorough.  By having its own objectives, the authors were able to create a document that will be useful for the program in Peru as well as other programs like this one in South America.  The evaluators created a logical way to measure what effect the program had on the schools chosen and how those objectives were or were not being met.  Not only did the evaluation allow for an analysis of whether or not objectives were being achieved, but also of how to initiate improvements to the program for future school involvement and improve the experience of those currently participating in the program.
            As far as weaknesses, I was not able to identify very many in the evaluation.  One concern I would have would be the possible premature analysis and evaluation. Is three months enough time to properly work with computers?  Another would be around the technology piece.  I felt that there should have been a stronger emphasis placed on technical support.

References

Inter-American Development Bank: http://idbdocs.iadb.org/wsdocs/getdocument.aspx?docnum=35422036