Saturday, September 29, 2012

"Playing the Game" of Algebra Story Problems- Coordinating Reasoning and Representation

The importance of mathematics instruction including "real life" contexts relevant to students' experiences is widely acknowledged (Common Core State Standards Initiative, 2010; NCTM, 2000; 2006; 2009). Common justifications for contextualized mathematics include the idea that relevant contexts may help students to apply what they learn in school to out-of-school situations, and that relevant contexts may scaffold learning by providing a bridge between what students understand and the content they are trying to learn. 

The study reported here investigates these justifications using story problems on linear functions. A situated cognition theoretical framework is used to interpret student behavior in the complex, social system of "school mathematics." In a series of interviews, students from a low-performing school were presented with algebra problems, some of which were personalized to the ways in which they described using mathematics in their everyday lives. Results showed that students rarely explicitly used situational knowledge when solving story problems, had consistent issues with verbal interpretation, and engaged in non-coordinative reasoning where they bypassed the intermediate step of understanding the given situation before trying to solve the problem. Problems with the same mathematical structure with different amounts of verbal and symbolic support elicited different strategies from students, with personalized problems having high response rates and high use of informal strategies. This suggests that students can use sophisticated, situation-based reasoning on contextualized problems, and that different problem framings may scaffold learning. However, results also demonstrated that the culture of schooling, and story problems as an artifact of this culture, undermines many of the justifications for contextualizing mathematics. -Dr. Petrosino


Playing the Game

Picture: Hoboken's own Cake Boss (Buddy Valastro) created a special cake to celebrate Hoboken High School's 50th Anniversary. Pictured from (L-R) are: Hoboken teacher and 50th Anniversary Coordinator Mr. Chris Munoz, Board of Education Trustee Carmelo Garcia, and Ms. Nina Delia.

Tuesday, September 25, 2012

Is there a STEM (Science Technology Engineering Math) crisis?

The Torch Bearers
The following information comes largely from the people at Schools Matter. The topic is somewhat controversial, since the common rhetoric centers on a so-called shortage of STEM professionals. Are there good jobs and occupations in the STEM disciplines? Certainly. But we must proceed cautiously and make sure there is actual research and statistical evidence behind some of our shared or common beliefs. -Dr. Petrosino

It is not clear that there is a STEM shortage. Some analysts, in fact, have claimed that there is a surplus of STEM-trained professionals (Teitelbaum, 2007; Toppo and Vergano, 2009; Bracey, 2009).

A report of the World Economic Forum (Schwab, 2012) confirms that there is no crisis. The US ranks near the top of the world in every category related to STEM education and availability of expertise: according to the report, the US ranks 5th out of 144 countries in "availability of scientists & engineers," is tied for 5th in "quality of scientific research institutions," ranks 3rd in "university-industry research collaboration," and ranks 7th in "capacity for innovation," which means that American innovation comes largely from research efforts done in the US.

Only three countries ranked in the top ten in all four of these categories: Sweden, Israel, and the US. China didn’t come close to the top ten in any of them.

"… the impending shortage of scientists and engineers is one of the longest running hoaxes in the country" (Bracey, 2009). 

Sources:

Bracey, G. 2009. Education Hell: Rhetoric vs. Reality. Alexandria, VA: Educational Research Service.
Schwab, K. 2012. The Global Competitiveness Report 2012-2013: Full Data Edition. World Economic Forum.
Teitelbaum, M. 2007. Testimony before the Subcommittee on Technology and Innovation, Committee on Science and Technology, U.S. House of Representatives, Washington, DC, November 6, 2007.
Toppo, G. and Vergano, D. 2009. Scientist shortage? Maybe not. USA Today, August 9, 2009.

Picture: In 1955 Anna Hyatt Huntington and her husband donated her sculpture “The Torch Bearers” to the University of Madrid. The work symbolizes “the passing of the torch of civilization from one generation to the next.” The version at Stevens Institute of Technology in Hoboken, New Jersey (pictured above) is one of three replicas. It was donated to Stevens in 1964, where it joined the college’s growing collection of paintings and sculpture. The two other replicas of “The Torch Bearers” are located at the Museum of Art, Science, and Industry in Bridgeport, Connecticut and the Chrysler Museum in Norfolk, Virginia.

Miscalculations in District's Summer Report Confuse Scores for Grades 2 through 7 on Reading (SRI) and Mathematics (SMI) Diagnostic Assessments

Recently, the Hoboken School District released its District Progress Report #3 (Summer 2012). In many ways, it is a very typical "good news" type of publication that districts provide to parents and the public. The report includes district goals, summer reading lists, some data on attendance, a number of mini-reports from each of the individual schools in the district (Brandt, Calabro, Connors, Hoboken High School, and Wallace), some discussion of sports, and a listing of students making the Honor Roll. There is also a letter from the Board President as well as some words from key district administrators (you can view the entire report as a PDF below).

One area of particular interest to me, and also of some concern, was a story entitled "Hoboken Students Demonstrate Strong Achievement," which contained information about pre- and post-test scores on the SRI (Scholastic Reading Inventory) and SMI (Scholastic Mathematics Inventory) assessments.

One need only look at a recent post on a local blog to see how this diagnostic information has been misused to give the impression that state-mandated test scores are on the rise. Please click HERE to view the misleading blog entry.

Many regular readers of my blog will recall that I brought the SRI into the district back in June of 2008. After being used effectively during the following academic year (2008-2009), it was all but abandoned when Interim Superintendent Carter entered the district in September 2009 and the political group known as Kids First took full control of the Board of Education.

In the Summer 2012 report, SRI and SMI scores are reported from two time points: one at the start of the school year (September 2011) and one at the end of the school year (June 2012). It is unfortunate that this rather simple diagnostic test was given at only two points; the tests are fully computerized and intended to be given at multiple points throughout the school year (we gave them at the end of each marking period, yielding four time points). Even more unfortunate is that someone reading this report may be confused and believe that the district's achievement (mandated) test scores are rising. That is not the case. These are NOT the NJASK scores or the HSPA scores, which are mandated by No Child Left Behind legislation. Rather, this is simply an in-district diagnostic test intended to provide feedback for instruction.

More unfortunate still, there appears to be a major error in the reporting of the percentage gains made during the 2011-2012 school year. When a quantity grows (gets bigger), we can compute its PERCENT INCREASE.

This is a simple calculation:

PERCENT INCREASE = (new amount - original amount) / original amount × 100%

Unfortunately, this simple formula was applied incorrectly, and the numbers reported by the district are wrong...and every one of the errors makes it appear that student gains were greater than they actually were.


Another error is that the percentages reported for the June 2012 administration of the SMI total 110% (25% + 30% + 22% + 33% = 110%). This is an obvious mistake, and one that also makes the post-test results confusing.
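
For readers who want to double-check figures like these, here is a minimal sketch in Python. The pre/post score values are hypothetical placeholders (only the 25/30/22/33 distribution comes from the report itself); the point is simply to show how a percent increase should be computed and how a reported distribution can be sanity-checked.

    # Hypothetical pre/post values; substitute the district's actual figures.
    original = 620   # e.g., mean SMI score, September 2011 (made-up number)
    new = 680        # e.g., mean SMI score, June 2012 (made-up number)

    # Percent increase = (new - original) / original, expressed as a percentage.
    percent_increase = (new - original) / original * 100
    print(f"Percent increase: {percent_increase:.1f}%")

    # Sanity check: a performance-level distribution should sum to 100%.
    june_2012_smi_distribution = [25, 30, 22, 33]   # percentages from the report
    total = sum(june_2012_smi_distribution)
    if total != 100:
        print(f"Warning: distribution sums to {total}%, not 100%.")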
To be clear, the scores on these diagnostics are informative and positive, but it is disappointing that they are riddled with computational errors (click picture to ENLARGE and see the Excel spreadsheet). It also would have been informative to know 1) how many students took the assessments overall (n), 2) how many students took the assessments in each grade, and 3) whether all students took both the pre- and post-assessments.


It will be interesting to see how the actual Spring 2012 NJASK scores for Grades 2-7 correlate with these non-NCLB diagnostic scores. THAT would be an informative analysis and one that would show clearly the utility of the SRI and SMI diagnostics as well as how effectively they are being used by district Directors, Supervisors, and Curriculum Coordinators in providing effective instruction in grades 2 through 7.
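
If student-level data were available, such a correlation analysis would take only a few lines. The sketch below is purely illustrative and rests on assumptions: the file name, the column names, and the premise that each student has both a June 2012 SMI score and a Spring 2012 NJASK math score are all hypothetical.

    # Sketch: correlate end-of-year SMI scores with Spring 2012 NJASK math scores.
    # Assumes a hypothetical CSV with one row per student and these column names.
    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv("grade5_scores.csv")   # hypothetical file
    df = df.dropna(subset=["smi_june_2012", "njask_math_2012"])

    r, p = pearsonr(df["smi_june_2012"], df["njask_math_2012"])
    print(f"n = {len(df)}, Pearson r = {r:.2f}, p = {p:.3g}")

A strong positive correlation would support using the SRI and SMI as low-stakes early-warning indicators; a weak one would suggest the diagnostics are not being used, or are not working, as intended.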

------------------------------------------------------------------------------
------------------------------------------------------------------------------

So, what is the SRI? The SRI is a group-administered screening assessment for identifying students’ reading levels and flagging students who are reading substantially below grade level. The SRI requires students to choose words to complete cloze sentences that represent the main idea of a short text. It includes a wide range of narrative and expository texts of increasing difficulty. The SRI provides teachers with scores on the same scale as the Lexile Framework, a readability scale, making the scores easy to interpret and useful for matching students with texts that have been assigned a Lexile level.

What is the purpose of the assessment?
“The tests are based on a powerful new system, called the Lexile Framework that can help you accurately assess your students’ comprehension levels and match them with appropriate texts for successful reading experiences.”

What is the test actually measuring?
Students’ ability to get the gist of short, narrative fiction and non-fiction passages sufficiently to correctly fill in the blank in short, summary sentences.

Overall Strengths
■ Matches students to texts using Lexile scale.
■ Scale is useful for showing growth over time.
■ Efficiency in identifying students who are reading significantly below grade level.
■ Usefulness for leveling students – the readability formula (a way of determining the difficulty level of a text) provides a simple way for teachers to gauge the difficulty students experience when reading texts of various levels.

Overall Weaknesses
■ Does not provide information about skills such as analyzing, evaluating texts on an aesthetic basis, appreciating or comparing texts.
■ Vocabulary is not assessed separately.
■ The SRI does not distinguish where the breakdown in comprehension occurs.
■ Length of the test; students could get tired because there are 54 different short passages.

Read more from an independent report by Carnegie Learning: CLICK HERE

The SMI (Scholastic Mathematics Inventory) is a research-based, computer-adaptive math assessment program for students in Grades 2-8+ that measures math understanding on The Quantile Framework® for Mathematics. The most powerful feature of the SMI is its ability to administer fast, reliable, low-stakes assessments to inform instruction and make accurate placement recommendations. Aligned to the Common Core State Standards, the SMI helps educators forecast student achievement toward those important goals.
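
Since "computer-adaptive" may be unfamiliar, here is a generic sketch of how adaptive item selection works: after each response, the ability estimate is nudged up or down and the next item is chosen to match it. This is only an illustration of the general idea, not Scholastic's actual algorithm, and the item difficulties and responses below are made up.

    # Generic sketch of computer-adaptive item selection (not Scholastic's algorithm).
    def next_item(ability_estimate, item_difficulties, used):
        """Return the index of the unused item closest in difficulty to the estimate."""
        candidates = [i for i in range(len(item_difficulties)) if i not in used]
        return min(candidates, key=lambda i: abs(item_difficulties[i] - ability_estimate))

    def run_adaptive_test(responses_by_item, item_difficulties, num_items=5, step=0.5):
        """Simulate a short adaptive test; responses_by_item[i] is True if the
        student would answer item i correctly (hypothetical, known in advance here)."""
        ability, used = 0.0, set()
        for _ in range(num_items):
            i = next_item(ability, item_difficulties, used)
            used.add(i)
            ability += step if responses_by_item[i] else -step   # crude update rule
        return ability

    # Hypothetical item bank (difficulties) and a hypothetical response pattern.
    difficulties = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
    responses = [True, True, True, True, False, False, False]
    print(run_adaptive_test(responses, difficulties))   # settles near the student's level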





Summer 2012 Newsletter

Friday, September 14, 2012

University of Texas at Austin Evacuated

Thank you for all the inquiries concerning the evacuation at The University of Texas. All is well. A call was certainly made and it seemed credible to the authorities but nothing has been found on campus. University closed for the rest of the day. Peace--

Wednesday, September 12, 2012

Xavier President Jack Raslowsky Named a Legend of Prep

Saint Peter's Prep, New Jersey’s Jesuit high school, recently announced that Xavier President Jack Raslowsky was named a Legend of St. Peter's Prep and will be honored at a dinner hosted by the school on Sept. 14th.


Jack Raslowsky

Mr. Raslowsky is a member of the Prep Class of 1979. Over the years he served as soccer coach, teacher, retreat leader and moderator of numerous activities. In 1992, he was named Prep’s first lay principal and served in that role until 2003 when he joined the Jesuit provincial staff where he served as the provincial assistant for education and lay formation. During his time as principal, Prep’s enrollment grew, lacrosse and volleyball were added as interscholastic sports, AP offerings increased, the art and music programs saw significant growth and the school community placed renewed focus on issues of mission and Jesuit identity. Mr. Raslowsky was named Xavier’s first lay president in June 2009.

The first Legends class was inducted in 1993 to celebrate alumni and others in the Prep community who have had a profound positive impact on the school and the greater community by exemplifying the ideals of Jesuit education. The Legends of Prep awards are the highest honor presented by the Prep Alumni Association. The Sept. 14th dinner at Mayfair Farms, in West Orange, NJ, will also highlight the accomplishments of several other honorees, including Br. Paul Harrison, S.J., Dr. Rich Kennedy, Tom McGinty, Prep ’71, Jack Savage, Prep ’57, and Br. Joseph Wuss, S.J.

When asked about his induction, Raslowsky commented: "It is a thrill to be named a Legend of Prep. My long association with the Society and the work of Jesuit education began at St. Peter’s and my days as a student and faculty member there were gifts beyond measure. Prep opened my eyes to a world 'charged with the grandeur of God' and my life has not been the same since. Without my association with St. Peter’s, being at Xavier would have been an impossible dream, so my time at Prep is a gift that keeps on giving. This honor is a reminder of the common ground all Jesuit institutions share. I am particularly honored to be inducted with Br. Paul Harrison, S.J. and Br. Joe Wuss, S.J. I worked closely with them both through the years and they were outstanding Jesuits and friends."

Xavier has other connections to this year’s Prep Legends. Mr. Jack Savage is the brother-in-law of Xavier Hall of Fame member and former board chair Mr. Gene Rainis ’58, and Mr. Tom McGinty serves as the owner’s representative for Xavier’s 15th St. project and master plan.

For more information on the Legends of Prep Dinner or to register today, click here.


Classes at the Hoboken Charter School Cancelled for Rest of the Week

Classes at the Hoboken Charter School will be canceled for the rest of the week, George Cahn, a spokesman for the school, said Monday afternoon.

"School administrators and the Board have narrowed down the viable options for interim space and hope to make a decision very shortly," Cahn, also a parent of a student at the school, wrote in an email, "with the goal of getting the kids back into class next week in an environment that won’t compromise the service-learning program."

Hoboken Patch reports that no official announcement was made as of Monday afternoon concerning relocation.


Picture: View of dark clouds from the Shipyard.

Saturday, September 8, 2012

Hoboken Charter School Fire Fundraiser

Please come out and support the Hoboken Charter School which recently suffered a terrible fire to its home on Washington Street.

Are Standardized Tests Worthless? - Diane Ravitch's Blog

All of U.S. education policy is now firmly hitched to standardized test scores.

Although the President said in his last State of the Union address that teachers should not teach to the test, he surely knows that federal policy demands teaching to the test.

Test scores determine teacher evaluation, teacher salary, teacher tenure, teacher bonuses. Test scores determine whether teachers and principals are fired. Test scores determine whether schools get closed or commended.

Test scores determine whether students are promoted or held back.

A professor at the University of Texas has concluded that the standardized tests are not reliable or valid. He says they predict how students will do in the future in relation to how well they have done on the same standardized tests in the past. They do not show what students have learned.

The story begins:

“In 2006, a math pilot program for middle school students in a Dallas-area district returned surprising results.

“The students’ improved grasp of mathematical concepts stunned Walter Stroup, the University of Texas at Austin professor behind the program. But at the end of the year, students’ scores had increased only marginally on state standardized TAKS tests, unlike what Mr. Stroup had seen in the classroom.

“A similar dynamic showed up in a comparison of the students’ scores on midyear benchmark tests and what they received on their end-of-year exams. Standardized test scores the previous year were better predictors of their scores the next year than the benchmark test they had taken a few months earlier.

“Now, in studies that threaten to shake the foundation of high-stakes test-based accountability, Mr. Stroup and two other researchers said they believe they have found the reason: a glitch embedded in the DNA of the state exams that, as a result of a statistical method used to assemble them, suggests they are virtually useless at measuring the effects of classroom instruction.”

Read the whole story and re-read that last line: the tests are “virtually useless at measuring the effects of classroom instruction.”

Think of it: we have a multi-billion dollar industry that sucks resources out of the classroom, whose tests are best at predicting how students will perform on next year’s tests. The tests measure each other. They are designed to do that. They demand teaching to the test.

I don’t know whether the professor’s concerns are right. Others with technical expertise will weigh in. But what was obvious before he spoke out is that these tests are not good enough to carry the weight of determining our social structure, let alone the lives of students and teachers and principals and the fate of their schools.

Thursday, September 6, 2012

Fire at Hoboken Charter School; Students Evacuated; 7 Firemen Treated for Smoke Inhalation

Full story by Hoboken Patch, updated often as the story develops. Click HERE. This was the site of the old Sacred Heart School on Washington St. in Hoboken.

A fire broke out at Hoboken Charter School this morning, injuring five firefighters and prompting the evacuation of 200 students and faculty in the school and at a next-door daycare center, authorities said.
An emergency worker on the scene said the fire started on the top floor of the building shortly before 10:30 a.m. at the school, located at Seventh and Washington streets. Six firemen from Hoboken and one from North Hudson were treated for smoke inhalation, Molta said.

"School administrators and board members are presently looking at and evaluating various options for interim instructional space in Hoboken for the K-8 classes," said George Cahn, from Cahn Communications, who is also a parent at the Hoboken Charter School. Classes were canceled on Friday as well as Monday, Sept. 10., for the K through 8th grade students of the school.

See additional local coverage by NJ.COM
See additional local coverage on Hoboken 411 by clicking HERE.

Photo: Claire Moses - Hoboken Patch
More pictures: HERE HERE HERE HERE HERE HERE

Design Flaw Suspected In Texas Standardized Tests - Morgan Smith (Texas Tribune)

A recent story by Morgan Smith of the Texas Tribune has gained a fair amount of attention over the past few weeks. The story highlights the work of a colleague of mine, Walter Stroup, on high-stakes, large-scale state student accountability testing in Texas, with possible implications nationwide. -Dr. Petrosino

In 2006, a math pilot program for middle school students in a Dallas-area district returned surprising results.

The students’ improved grasp of mathematical concepts stunned Walter Stroup, the University of Texas at Austin professor behind the program at Richardson Independent School District. But at the end of the year, students’ scores had increased only marginally on state standardized TAKS tests, unlike what Stroup had seen in the classroom.

A similar dynamic showed up in a comparison of the students’ scores on midyear benchmark tests and what they received on their end-of-year exams. Standardized test scores the previous year were better predictors of their scores the next year than the benchmark test they had taken a few months earlier.

Now, in studies that threaten to shake the foundation of high-stakes test-based accountability, Stroup and two other researchers said they believe they have found the reason: a glitch embedded in the DNA of the state exams that, as a result of a statistical method used to assemble them, suggests they are virtually useless at measuring the effects of classroom instruction.

Pearson, which has a five-year, $468 million contract to create the state’s tests through 2015, uses “item response theory” to devise standardized exams, as other testing companies do. Using IRT, developers select questions based on a model that correlates students’ ability with the probability that they will get a question right.

That produces a test that Stroup said is more sensitive to how it ranks students than to measuring what they have learned. Such a design flaw could also explain why Richardson students’ scores on the previous year’s TAKS test were a better predictor of performance on the next year’s TAKS test than the benchmark exams were, he said. The benchmark exams were developed by the district, the TAKS by the testing company.
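
For readers unfamiliar with item response theory, the following sketch illustrates the kind of probability model being described: a generic two-parameter logistic (2PL) model relating a student's ability to the chance of answering an item correctly. This is a textbook illustration, not Pearson's actual implementation, and the parameter values are invented.

    import math

    def p_correct(theta, a, b):
        """2PL item response model: probability that a student with ability theta
        answers correctly an item with discrimination a and difficulty b."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # Hypothetical item: discrimination 1.2, difficulty 0.5 (on the ability scale).
    for theta in (-1.0, 0.0, 0.5, 1.0, 2.0):
        print(f"ability {theta:+.1f} -> P(correct) = {p_correct(theta, 1.2, 0.5):.2f}")

Because item selection under such a model rewards questions that best separate high-ability from low-ability students, Stroup's argument is that the resulting tests end up better at ranking students than at detecting what a particular year of instruction added.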

Stroup, who is preparing to submit the findings to multiple research journals, presented them in June at a meeting of the House Public Education Committee. He said he was aware of their implications for a widely used and accepted method of developing tests, and for how the state evaluates public schools.

“I’ve thought about being wrong,” Stroup said. “I’d love if everyone could say, ‘You are wrong, everything’s fine,’ ” he said. “But these are hundreds and hundreds of numbers that we’ve run now.”

Gloria Zyskowski, the deputy associate commissioner who handles assessments at the Texas Education Agency, said in a statement that the agency needed more time to review the findings. But she said that Stroup’s comments in June reflected “fundamental misunderstandings” about test development and that there was no evidence of a flaw in the test.

After a lengthy back and forth at the meeting, the committee’s chairman, Rob Eissler, suggested a “battle of the bands” — a hearing where the test vendors and researchers traded questions. Eissler, Republican of The Woodlands, said recently that he found Stroup’s research “very interesting” and that he was weighing another hearing.

Stroup’s research comes as growing opposition to high-stakes standardized testing in Texas is creating an alliance between parents, educators and school leaders who wonder how the tests affect classroom instruction and small-government conservatives who question the expense and bureaucracy they impose.

This is not the first time that the way the state uses standardized test scores in the accountability system has been questioned. In 2009, the state implemented the Texas Projection Measure, a formula that factored a prediction of students’ future performance on state exams into schools’ accountability ratings. Lawmakers, led by state Rep. Scott Hochberg, attacked the measure, saying it allowed schools to count students as passing who did not. After outcry prompted the education agency to issue ratings with and without the measure in 2010, the state dropped it completely the next year.

Hochberg, D-Houston, has since proposed legislation aimed at reforming the role of standardized testing in public schools because of the data he saw as he led the charge against the measure. It showed that a student’s test score in one year strongly predicted that student’s score the next year.

“I have for a long time said that the accountability system doesn’t give us all the information that the numbers are used to generate,” Hochberg said, adding that basing accountability “more on the kid’s history than the specifics of what happened in the classroom that year may make us feel good but it doesn’t give us any true information.”


Picture: Dr. Stroup being interviewed for this article.