"Testing, Testing, 1, 2, 3..."
This collection has been gathered to raise awareness about concerns related to high-stakes standardized tests, as well as assessments that attempt to measure social and emotional characteristics among youth. The collection serves as a research tool to organize online content. A grey funnel-shaped icon at the top right corner of the screen (in desktop view mode) allows keyword searches of content (such as PARCC, SBAC, SAT, Pearson, or validity). Readers are encouraged to explore related links within each post for additional information. Views provided here are for information only and do not necessarily represent an official position of the curator or her employer. For more updates, see the Educator Resources tab at http://EduResearcher.com.

Educational Policies and Youth in the 21st Century: Problems, Potential, and Progress // Edited by Sharon Nichols


Educational Policies and Youth in the 21st Century // Edited by Sharon L. Nichols, University of Texas at San Antonio

 

Published 2016

"As our student population diversifies rapidly, there is a critical need to better understand how national, regional, and/or local policies impact youth in school settings. In many cases, educational policies constructed with the goal of helping youth often have the unintended consequence of inhibiting youth’s potential. This is especially the case when it comes to youth from historically underrepresented groups. Over and over, educational legislation aimed at improving life for youth has had the negative effect of eroding opportunities for our most vulnerable and often times less visible youth.

The authors of this book examine the schooling experiences of Hispanic, African American, Indigenous, poor, and LGBT youth groups as a way to spotlight the marginalizing and shortsighted effects of national education language, immigration, and school reform policies. Leading authors from across the country highlight how educational policies impact youth’s development and socialization in school contexts. In most cases, policies are constructed by adults, implemented by adults, but are rarely informed by the needs and opinions of youth. Not only are youth not consulted but also policymakers often neglect what we know about the psychological, emotional, and educational health of youth. Therefore, both the short- and long-term impacts of these policies have but limited effects on improving students’ school performance or personal health issues such as depression or suicide.

In highlighting the demographic and cultural shifts of the 21st century, this book provides a compelling case for policymakers and their constituents to become more sensitive to the diverse needs of our changing student population and to advocate for policies that better serve them.

 

CONTENTS
Preface. Acknowledgments.
PART I: CHARACTERISTICS AND EXPERIENCES OF 21ST CENTURY YOUTH.
Educational Policy and Latin@ Youth in the 21st Century, P. Zitlali Morales, Tina M. Trujillo, and René Espinoza Kissell.

The Languaging Practices and Counternarrative Production of Black Youth, Carlotta Penn, Valerie Kinloch, and Tanja Burkhard.

Undocumented Youth, Agency, and Power: The Tension Between Policy and Praxis, Leticia Alvarez Gutiérrez and Patricia D. Quijada.

Sexual Orientation and Gender Identity in Education: Making Schools Safe for All Students, Charlotte J. Patterson, Bernadette V. Blanchfield, and Rachel G. Riskind.

Youths of Poverty, Bruce J. Biddle.

PART II: PROMINENT EDUCATIONAL POLICIES AFFECTING YOUTH.

Language Education Policies and Latino Youth, Francesca López.

The Impact of Immigration Policy on Education, Sandra A. Alvear and Ruth N. López Turley.


Mismatched Assumptions: Motivation, Grit, and High‐Stakes Testing, Julian Vasquez Heilig, Roxana Marachi, and Diana E. Cruz.

PART III: IMPLICATIONS FOR BETTER POLICY DEVELOPMENT FOR 21ST CENTURY YOUTH.
Searching Beyond Success and Standards for What Will Matter in the 21st Century, Luke Reynolds.

New Policies for the 21st Century, Sharon L. Nichols and Nicole Svenkerud‐Hale.
Social Policies and the Motivational Struggles of Youth: Some Closing Comments, Mary McCaslin."

 

http://www.infoagepub.com/products/Educational-Policies-and-Youth-in-the-21st-Century 


Mississippi Fires [Pearson] Testing Firm After Exams Wrongly Scored


By Jeff Amy

JACKSON, Miss. (AP)

"The Mississippi Department of Education is firing a testing company, saying scoring errors raise questions about the graduation status of nearly 1,000 students statewide.

The state Board of Education revoked a contract with NCS Pearson in closed session Friday, after the Pearson PLC unit told officials it used the wrong table to score U.S. history exams for students on track to graduate this spring. Students who did poorly got overly high scores, while those who did better didn’t get enough credit.

 
Associate Superintendent Paula Vanderford says it’s too soon to know how many students may have graduated or been denied diplomas in error, or what the state will do about either circumstance.

Pearson spokeswoman Laura Howe apologized on behalf of the company and said Pearson is working to correct the scores.

 

“We are disappointed by today’s board decision but stand ready to assist the state in any way possible,” she wrote in an email.

 

Students typically study U.S. history in their third year in high school, and take the subject test that spring. Students who score poorly, though, can take the test up to three more times as a senior. The 951 students in question were either seniors or juniors scheduled to graduate early, and needed their scores to earn diplomas.

 

The answers about graduating students will be tricky because students have different options to graduate. Formerly, every student had to pass each of Mississippi’s four subject tests in biology, history, algebra and English to earn a high school diploma. Now, students can fail a test and still graduate if class grades are high enough, they score well enough on other subject tests, they score above 17 on part of the ACT college test, or earn a C or better in a college class.

 

Eventually, the tests will count for 25 percent of the grades in each subject.

 

About 27,000 students took the test overall. Vanderford said scores for each one will have to be verified. The exam scores also affect the grades that Mississippi gives to public schools and districts.

 

“The agency is committed to ensuring that the data is correct,” she said.

 

Vanderford said Pearson has had other problems with its Mississippi tests. In 2012, a scoring error on the high school biology exam wrongly denied diplomas to five students. Pearson compensated them with $50,000 scholarships to any Mississippi university. Another 116 students who were affected less severely got $10,000 or $1,000 scholarships. In 2015, Pearson paid the state $250,000 after its online testing platform crashed for a day.

 
Pearson had a contract worth a projected $24 million over the next six years to provide tests for history, high school biology, 5th grade science and 8th grade science. The board hired Minnesota-based Questar Assessment to administer all those tests for one year for $2.2 million.
 
Questar, which is being bought by nonprofit testing giant ETS, already runs all of Mississippi’s language arts and math tests. Because Mississippi owns the questions to the history and science tests, Vanderford said it will be possible for Questar to administer those exams on short notice. The state will seek a contractor to give those tests on a long-term basis in coming months."

 

For main story, please see: 

https://apnews.com/115d48fe350843d6baa60bc277fd1bc8 


Pearson Botches Mississippi Testing [Again]; Mississippi Immediately Severs Contract


"Education and testing mammoth Pearson has an established history in botching high-stakes testing.

 

Pearson did it again, in Mississippi.

 

According to the Associated Press (AP), Mississippi canceled its contract with the testing giant after Pearson fessed up to mixing up scoring tables for an exam that now has approximately 1,000 Mississippi students either graduating when exit scores were not actually high enough or not graduating because of test scores that were not too low after all.

From AP on Friday, June 16, 2017:
"The Mississippi Department of Education is firing a testing company, saying scoring errors raise questions about the graduation status of nearly 1,000 students statewide.

 

The state Board of Education revoked a contract with NCS Pearson in closed session Friday, after the Pearson PLC unit told officials it used the wrong table to score U.S. history exams for students on track to graduate this spring. Students who did poorly got overly high scores, while those who did better didn’t get enough credit. Associate Superintendent Paula Vanderford says it’s too soon to know how many students may have graduated or been denied diplomas in error, or what the state will do about either circumstance."

 

The AP release continues, noting that an inept-yet-contrite Pearson will “assist the state in any way possible.”

 

Of course, the way to assist the state is to not put the state in this awful position to begin with.

 

And it’s not the first time Pearson incompetence has caused Mississippi problems.

 

As the AP continues:
"In 2012, a scoring error on the high school biology exam wrongly denied diplomas to five students. Pearson compensated them with $50,000 scholarships to any Mississippi university. Another 116 student who were affected less severely got $10,000 or $1,000 scholarships. In 2015, Pearson paid the state $250,000 after its online testing platform crashed for a day.

 

What is astounding is that even as Pearson profits are suffering to a record extent, its CEO, John Fallon, received a 20-percent pay raise in May 2017.

 

From the May 05, 2017, Telegraph:

"Two thirds of shareholders rejected the company’s remuneration report at its AGM after Mr Fallon received a £343,000 [$439,383] bonus, equivalent to a 20pc [percent] pay rise, despite having presided over its worst 12 months in nearly half a century on the stock exchange.


Mr Fallon’s position was undermined as 66pc of shareholders voted against his pay in a meeting marked by protests from teaching unions over Pearson’s activities in the developing world. …

Earlier in the day, Mr Fallon had sought to calm criticism of his bonus by spending all of it, net of tax, on Pearson shares to align his own interests with those of shareholders.


He declined to comment on whether he considered rejecting the bonus, which came after a £2.6bn [$3.34 billion] annual loss and the biggest ever one-day fall in Pearson’s shares following a massive profit warning. …


Despite the controversy, the shares were up nearly 12pc in the afternoon after Pearson unveiled a new £300m [$384 million] tranche of job cuts and office closures, in the latest phase of Mr Fallon’s battle to reverse its fortunes. His third round of restructuring comes after 4,000 staff were cut last year, when it sought similar savings."

 

Indeed, Fallon is being rewarded for throwing the crew overboard on a poison ship that is taking more water than ever.

 

It seems, however, that the Mississippi Board of Education has finally had enough of Pearson."

 

For original post, see: 
https://deutsch29.wordpress.com/2017/06/17/pearson-botches-mississippi-testing-again-mississippi-immediately-severs-contract/ 


Pearson Explores Sale of Its U.S. K-12 Curriculum Business // EdWeek Market Brief


"Pearson, the largest education company in the world, announced today that it’s considering selling off its U.S. K-12 digital and print curriculum business, citing the “slow pace of digital adoption” in schools.

 

Besides that issue, the company cited a “challenging competitive and market environment” and the high capital needs of the digital curriculum market as reasons for its announcement of a strategic review of that portion of the business.

 

The U.S. Learning Services business, as Pearson refers to the division in question, sells K-12 print, digital, and blended curriculum, and includes products like enVision Math and iLit. It does not include Advanced Placement (AP), career and technical education, or online courses taken in high school, a company spokesman said.

 

The announcement was part of the company’s sharing of its 1st quarter 2017 results. A release accompanying that report touted the company’s “progress” in accelerating some aspects of digital delivery of the company’s content, focusing on its higher education and K-12 efforts in that vein.

 

For K-12, the company said its future focus will be in three areas: investing in virtual schools via Connections Education, which the company said is one of its fastest-growing businesses; building on the company’s position in U.S. school assessment, and “powering online learning,” by investing in digital courses for use in blended and virtual teaching within physical schools.

 

The influence and outcomes of virtual schools or cyber charters were examined in a November 2016 Education Week investigation by Ben Herold and Arianna Prothero. Connections Academy, an online school provider for grades K-12 that Pearson purchased in 2011, was covered in the series of stories. Pearson’s Connections Education provided a defense of cyber charters in the Education Week coverage.

 

No timeline of the potential sale or valuation of the business is available at this time, said a company spokesman."

 

For full post, see: https://marketbrief.edweek.org/marketplace-k-12/pearson-explores-sale-u-s-k-12-curriculum-business/ 


Some Long Island Students Given Wrong Common Core Exam // NewsDay 


"Sixty-four third-graders — including 12 from Long Island — were erroneously given exams meant for fourth-graders on the first day of computer-based English Language Arts"

 

http://www.newsday.com/long-island/education/some-li-third-graders-given-wrong-common-core-exam-1.13325036 


Nevada Accepts $1.8M Settlement Over [SBAC] Student Testing Program // Review Journal 

By Sean Whaley // Las Vegas Review-Journal Capital Bureau. [Photo credit: Jeff Scheid, Las Vegas Review Journal]

"CARSON CITY — A state panel on Tuesday approved a $1.8 million settlement in favor of the Nevada Department of Education over a botched student testing program.


The settlement with the Smarter Balanced Assessment Consortium, a part of the UCLA Graduate School of Education and Information Studies, follows a $1.3 million settlement reached last year with Measured Progress Inc. The Smarter Balanced part of the contract involved providing test content and the platform to provide the testing.

The settlements will allow Nevada to avoid litigation over the failed testing system that prevented thousands of Nevada students in grades three through eight from taking federally mandated assessments under new Common Core standards in the spring of 2015.

The Board of Examiners, including Gov. Brian Sandoval, approved the Smarter Balanced settlement, which is composed of several parts, including goods and services from the organization worth nearly $1 million. Another $100,000 is being spent by the organization to hire a firm to assess the validity of the 2015 criterion-referenced test scores.

 

The 2016 testing effort was accomplished with no major glitches, said Greg Bortolin, public information officer for the state Department of Education.

“In light of what happened in the previous year this was really good news,” he said.

Only 30 percent of the roughly 214,000 students expected to take the online tests in 2015 successfully completed the assessments because the system repeatedly crashed and many students were unable to log into the testing server. School officials eventually gave up.

Sandoval acknowledged the good news this year, but noted that the 2015 effort was a disaster that almost put the state’s federal funding at risk.

The settlements also show clearly that the two entities were responsible for the failures in 2015, he said.

 

“The people who got hurt were the kids,” Sandoval said."...

 

For full post, click on title above or here: 
http://m.reviewjournal.com/news/education/nevada-accepts-18m-settlement-over-student-testing-program 


Who’s Marking Those Common Core High-Stakes Tests? 


http://www.huffingtonpost.com/entry/whos-marking-those-common-core-high-stakes-tests_us_5922bf13e4b0b28a33f62dcd 

 


Technical Glitches Plague Computer-Based Standardized Tests Nationwide // Washington Post


By Emma Brown

"As most states have moved to new standardized tests based on the Common Core during the past two years, many also have switched from administering those tests the old-fashioned way — with paper and No. 2 pencils — to delivering them online using computers, laptops and tablets.

 

The transition aims to harness the power of technology to move beyond simplistic multiple-choice questions, using interactive questions and adaptive techniques to measure students’ critical thinking and problem-solving skills.

 

But the shift to computer-based testing has been riddled with technical glitches that have spanned many testing companies and states, including those that have adopted Common Core and those using other new academic standards.

 

Stressed-out students have found they sometimes can’t log on to their exams or are left to panic when their answers suddenly disappear. Frustrated teachers have had to come up with last-minute lesson plans when testing fails. Some school systems — and even entire states — have had to abandon testing altogether because of Internet hiccups thousands of miles away."....

 

For full post, click on title above or here:

https://www.washingtonpost.com/local/education/technical-glitches-plague-computer-based-standardized-tests-nationwide/2016/04/13/21178c7e-019c-11e6-9203-7b8670959b88_story.html 


What’s Wrong With The U.S. News Best High School Rankings? // via The Progressive 


"Every time U.S. News and World Report issues its Best High School Rankings Index, I think of basketball. Here’s why. If you look at the CBS Sports list of top-ranking high school basketball teams in Pennsylvania, of the top twelve ranking programs only one is a public school. The rest include charter schools, Catholic schools, one private academy, and a Quaker boarding school.

I see two possible explanations for the lack of public schools: either those private schools know some important secrets about coaching basketball, or they benefit from being able to recruit and select the best players for their teams. I’m betting it’s the latter.

 

So when U.S. News announces that charters are marching up the rankings list, it’s pretty important to take a peek at just how those schools are assessed.

The selection method is a curious one, based on a series of hurdles. First, the school must show that it performed “better than expected” on the Big Standardized Test for its state (e.g. PARCC or SBA). “Better than expected” is based on a statistical model developed to look at genetic trends in cattle. I kid you not. It compares actual test results with an ideal alternative universe. If the real universe student does better than what the model predicts, the model assumes that’s because the teacher and school did something right. The technique has been criticized by statisticians and educators alike, but it remains the first hurdle that a U.S. News Super School must jump. (There is one loophole—all schools that score in the top 10 percent for their state automatically qualify, whether they beat expectations or not.)

The second hurdle is the scores of select groups like “black, Hispanic and low-income” viewed separately from the whole school population. A third hurdle is the school’s graduation rate—is it over 75 percent? Many charter schools are adept at getting around that one, by pushing out low-performing students before they reach senior year. Looking at how many freshmen make it all the way through graduation would be more telling.

Finally, the schools that have made it this far are judged on the AP test rate—how many students take how many Advanced Placement tests. Since the AP test is a product sold by the College Board company, clearing this last gate means that somebody needs to fork over some money. Rating a school based on its AP test rate is like rating a school based on how many of its faculty members drive a Lexus instead of a Ford or BMW.

In fact, because the International Baccalaureate (an AP test competitor) didn’t supply test result data this year, those schools could not win the Super Best Gold Medal. South Dakota, for whatever reason, also refused to play along with Step Four.

So, in short, a U.S. News Tippy Top School is one which enrolls lots of students who are excellent test takers, some of whom belong to a minority group, and who are also willing to—and can afford to—take a bunch of AP tests. That is what’s behind being called a “Great School” by U.S. News.
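To make the hurdle sequence concrete, here is a minimal sketch of that kind of gate-by-gate filter. It is an illustration only, not the actual U.S. News/RTI methodology; the field names, thresholds, and sample data below are hypothetical stand-ins.

```python
# Minimal sketch of a hurdle-style selection filter (hypothetical fields and
# thresholds; NOT the actual U.S. News / RTI International methodology).

def passes_hurdles(school: dict) -> bool:
    """Return True only if a school clears every gate in sequence."""
    # Hurdle 1: "better than expected" on the state test, with the loophole
    # for schools scoring in the top 10 percent of their state.
    if not (school["beat_expectations"] or school["state_percentile"] >= 90):
        return False
    # Hurdle 2: disaggregated subgroup scores must also clear a benchmark.
    if school["subgroup_score"] < school["subgroup_benchmark"]:
        return False
    # Hurdle 3: graduation rate over 75 percent.
    if school["grad_rate"] <= 0.75:
        return False
    return True

# Hypothetical sample data for illustration only.
schools = [
    {"name": "Example High", "beat_expectations": True, "state_percentile": 62,
     "subgroup_score": 0.71, "subgroup_benchmark": 0.65,
     "grad_rate": 0.88, "ap_index": 0.42},
    {"name": "Sample Charter", "beat_expectations": False, "state_percentile": 94,
     "subgroup_score": 0.80, "subgroup_benchmark": 0.65,
     "grad_rate": 0.91, "ap_index": 0.67},
]

# Final step: the survivors are ranked on AP participation/performance.
finalists = sorted(
    (s for s in schools if passes_hurdles(s)),
    key=lambda s: s["ap_index"],
    reverse=True,
)
print([s["name"] for s in finalists])
```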

 

U.S. News does its selection in conjunction with RTI International, a North Carolina research group, and this particular project is overseen by Ben Dalton. Dalton has a sociology PhD from Duke. After working at American Institutes for Research, the test manufacturer that brought us the Common Core SBA test, he moved to RTI. So, the system for ranking America’s top high schools is run by someone with no actual education background.

And how well does his system actually work?

 

Well, the fourth best high school in New York doesn’t exist. Blogger Gary Rubinstein did some digging and discovered that the KIPP Academy Charter High School is a clerical collection of about one-third of all the high school students actually enrolled in four KIPP schools in New York.

The BASIS schools that top the U.S. News list do exist, but are a textbook example of how to use a charter school to fill your own pockets. The BASIS chain is essentially one married couple—Michael and Olga Block—who have made millions of dollars for themselves and their family members.

 

So what, you ask, if they are getting good results? Well, now we’re back to basketball rankings again. Arizona’s student population is about 45 percent Latino; BASIS Latino population is 10 percent. Arizona students are 39 percent white; BASIS is 51 percent white. Arizona students are 3 percent Asian; BASIS is 32 percent. And BASIS enrollment figures show no Native Americans at all. Almost half of Arizona students receive free or reduced lunch. BASIS does not participate in that program at all, so their 0 percent “free lunch” number doesn’t mean they have no poor students, but it does mean that those students must give up that benefit in order to attend BASIS schools!

 

Charter fans have been gleeful over the list, declaring that the list “teaches that charter schools are working.” If we believe that a “working” school is one where a curated group of students get high standardized test scores and take lots of AP tests, then this victory lap might be justified. Of course, it might also be true that only Catholics really know how to play basketball. But somehow, I don’t think so."...

 

http://progressive.org/public-school-shakedown/what%E2%80%99s-wrong-with-the-u-s-news-best-high-school-rankings/ 


Rethinking the Use of Tests: A Meta-Analysis of Practice Testing // Review of Educational Research


"Abstract
The testing effect is a well-known concept referring to gains in learning and retention that can occur when students take a practice test on studied material before taking a final test on the same material. Research demonstrates that students who take practice tests often outperform students in nontesting learning conditions such as restudying, practice, filler activities, or no presentation of the material. However, evidence-based meta-analysis is needed to develop a comprehensive understanding of the conditions under which practice tests enhance or inhibit learning. This meta-analysis fills this gap by examining the effects of practice tests versus nontesting learning conditions. Results reveal that practice tests are more beneficial for learning than restudying and all other comparison conditions. Mean effect sizes were moderated by the features of practice tests, participant and study characteristics, outcome constructs, and methodological features of the studies. Findings may guide the use of practice tests to advance student learning, and inform students, teachers, researchers, and policymakers. This article concludes with the theoretical and practical implications of the meta-analysis."

 

Adesope, O. O., Trevisan, D. A., & Sundararajan, N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research. Available online at: http://journals.sagepub.com/doi/abs/10.3102/0034654316689306?journalCode=rera
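For readers unfamiliar with meta-analytic terms like "mean effect size," the sketch below shows one common way such values are computed: a standardized mean difference (Cohen's d) for each study, then an inverse-variance weighted average across studies. The numbers are invented for illustration and are not drawn from the Adesope et al. data.

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a practice-test group and a comparison group."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def d_variance(d, n_t, n_c):
    """Approximate sampling variance of d, used for inverse-variance weighting."""
    return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

# Invented results: (mean_test, sd_test, n_test, mean_control, sd_control, n_control)
studies = [
    (78.0, 10.0, 40, 71.0, 11.0, 40),
    (65.0, 12.0, 55, 62.0, 12.5, 55),
    (82.0,  9.0, 30, 73.0, 10.0, 28),
]

effects = [cohens_d(*s) for s in studies]
weights = [1.0 / d_variance(d, s[2], s[5]) for d, s in zip(effects, studies)]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
print(f"per-study d: {[round(d, 2) for d in effects]}, pooled d: {pooled:.2f}")
```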


Predicting Middle Level State Standardized Test Results Using Family and Community Demographic Data // RMLE Research in Middle Level Education


Abstract
The use of standardized test results to drive school administrator evaluations pervades education policymaking in more than 40 states. However, the results of state standardized tests are strongly influenced by non-school factors. The models of best fit (n = 18) from this correlational, explanatory, longitudinal study predicted accurately the percentage of middle school students scoring proficient or above on the New Jersey state-mandated standardized tests in mathematics and language arts for grades 6–8 during the years 2010, 2011, and 2012 for 70% to 78% of the schools in the statewide samples (n = 292 to 311), using only family and community demographic variables from the U.S. Census. Just three demographic variables, (a) percentage of families in a community with income over $200,000 a year, (b) percentage of people in a community in poverty, and (c) percentage of people in a community with bachelor’s degrees, predicted results accurately in 14/18 of the models. The findings suggest that state standardized test results are not as objective and transparent as advertised by state and federal department of education officials."...

Tienken, C. H., Colella, A., Angellilo, C., Fox, M., McCahill, K. R., & Wolfe, A. (2016). Predicting Middle Level State Standardized Test Results Using Family and Community Demographic Data. RMLE Online, 40(1), 1-13. doi: 10.1080/19404476.2016.1252304

http://www.tandfonline.com/doi/full/10.1080/19404476.2016.1252304 
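To illustrate the kind of model the abstract describes, the sketch below fits an ordinary least-squares regression predicting the percentage of proficient students from the three community demographic variables highlighted above. The data values are fabricated placeholders for illustration, not figures from the Tienken et al. study.

```python
import numpy as np

# Fabricated placeholder data (NOT from the Tienken et al. study): each row is a
# school's community -- [% families with income > $200K, % people in poverty,
# % people with bachelor's degrees].
X = np.array([
    [12.0,  6.0, 45.0],
    [ 2.0, 18.0, 22.0],
    [ 8.0,  9.0, 38.0],
    [ 1.0, 25.0, 15.0],
    [15.0,  4.0, 52.0],
    [ 5.0, 14.0, 28.0],
])
# Percentage of students scoring proficient or above (also fabricated).
y = np.array([82.0, 48.0, 71.0, 35.0, 88.0, 56.0])

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coeffs
print("coefficients (intercept, income>200K, poverty, bachelor's):", np.round(coeffs, 2))
print("predicted vs. actual:", list(zip(np.round(predicted, 1), y)))
```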


Vermont State Board of Education Sends Letter to Parents Discrediting Smarter Balanced Test Scores


By Valerie Strauss 
November 7th, 2015 


"It’s not common for education policymakers to tell parents that they can give short shrift to their child’s scores on Common Core standardized tests (or on pretty much any test, for that matter), but that’s what the Vermont State Board of Education has just done.


Meeting earlier this week, the board, which includes the state’s education secretary, Rebecca Holcombe, approved a remarkable message for parents about scores on the 2015 Common Core tests known as SBAC, for the multi-state Smarter Balanced Assessment Consortium, which created the exams.


The SBAC, along with tests created by another multi-state consortium, the Partnership for the Assessment of Readiness for College and Careers, or PARCC, were designed to be more sophisticated and better able to evaluate what students have learned than earlier-generation standardized tests. But the exams are not the “game-changing” assessment instruments the Obama administration — which funded their creation — had predicted because of time and money constraints.


With the recent release of the 2015 scores from tests taken in the spring, Vermont’s State Board, of which Holcombe is a member, approved a memorandum telling parents and guardians not to worry about the results because their meaning is at best limited. It says in part:


"We call your attention to the box labeled “scale score and overall performance.” These levels give too simplistic and too negative a message to students and parents. The tests are at a very  high level. In fact, no nation has ever achieved at such a level. Do not let the results wrongly discourage your child from pursuing his or her talents, ambitions, hopes or dreams.


These tests are based on a narrow definition of “college and career ready.” In truth, there are many different careers and colleges and there are just as many different definitions of essential skills. In fact, many (if not most) successful adults fail to score well on standardized tests. If your child’s scores show that they are not yet proficient, this does not mean that they are not doing well or will not do well in the future.


We also recommend that you not place a great deal of emphasis on the “claims” or sub-scores. There are just not enough test items to give you reliable information."...


##


For main post and link to full letter, click here:
https://www.washingtonpost.com/news/answer-sheet/wp/2015/11/07/vermont-to-parents-dont-worry-about-your-childs-common-core-test-scores-they-dont-mean-much/


Error Invalidates Hundreds of ISTEP+ Math Scores // Journal Gazette Indiana

 

By Niki Kelly // The Journal Gazette
"INDIANAPOLIS - A testing error has invalidated hundreds of student ISTEP+ math scores around the state, including at three area schools.

 

The students were mistakenly given access to calculators on a section of the test where calculators were not allowed.

 

The Indiana Department of Education and school officials say testing vendor Pearson is to blame for the error.

 

"It's so discouraging for the children. It's discouraging for everyone," said Lori Vaughn, assistant superintendent at DeKalb Central United schools. "It is what it is. I hate that expression but we're going to move on. It's a black eye when DOE puts (scores) out."

 

She said 34 students in third grade at Waterloo Elementary and 19 students in fourth grade at the school will receive "undetermined" scores. This results in passing rates of less than a percent for third grade and 17 percent for fourth.

 

"It's horrific," Vaughn said. "And that's what's going to be put out with no explanation. It will impact our participation rate and our accountability grade."

 

Test scores are a large factor in the A to F accountability grades that schools will receive later this year.

 

Department of Education officials told Vaughn there is nothing that can be done now but schools can appeal those A to F grades when they are issued.

 

She explained that schools received guidance on calculators that seemed different than previous years. Two people in the district called the company separately to verify the information and were told by Pearson to proceed as directed.

 

So when the test began the calculator icon came up on the screen for students who shouldn't have been allowed to use a calculator. Some special education students are provided calculators as an accommodation.

 

Vaughn said two other schools in the district luckily hadn't started testing before the error was realized. Pearson said it is aware of the "isolated issues" having to do with calculator accommodations.

 

"In some cases, Pearson inadvertently provided inaccurate or unclear guidance on the use of calculators during testing. In these instances, we followed up quickly to help local school officials take corrective action," a statement from the company said. "Pearson regrets that any Indiana students, teachers, and schools were impacted by this issue."

 

It affected only 20 schools out of hundreds, including New Haven Middle School and Emmanuel-St. Michael Lutheran School in Fort Wayne. Molly Deuberry, communications director for the Department of Education, didn't have an overall number of students affected.

 

The biggest problem came at Rochester schools in Fulton County, where 700 elementary, middle and high school kids mistakenly had access to the calculator. Some used it and others were stopped by individual teachers.

 

Their results have been invalidated. Some sophomores who were specifically affected will need to retake the math portion of the assessment.

 

A Department of Education press release said it is working with school corporations to evaluate options for limiting the accountability impact.

 

Rochester and other schools may have a high volume of undetermined math results due to the invalidation, which in turn leaves proficiency rates and growth scores to be based on a small subset of the overall school population in 2016-17, and student test results from the 2015-16 school year.

 

The department does not have any authority under current statutes to address or rectify this concern. However, the State Board of Education conducts an appeals process for schools that believe the final A-F letter grade does not accurately reflect the school's performance, growth, or multiple measures.

Parents received access to student scores starting this week. Individual appeals can be brought."...


For original post, please see: 

http://www.journalgazette.net/news/local/indiana/20170620/error-invalidates-hundreds-of-istep-math-scores 


Over 100 Education Researchers Sign Statement Calling for Moratorium on High-Stakes Testing, SBAC // California Alliance of Researchers for Equity in Education


"The California Alliance of Researchers for Equity in Education recently released a research brief documenting concerns and recommendations related to the Common Core State Standards Assessments in California (also referred to as the CAASPP, California Assessment of Student Performance and Progress or “SBAC” which refers to the “Smarter Balanced” Assessment Consortium).  A two-page synopsis as well as the full CARE-ED research brief may be downloaded from the main http://care-ed.org website.  The following is an introduction:

“Here in California, public schools are gearing up for another round of heavy testing this spring, including another round of Common Core State Standards assessments. In this research brief, the California Alliance of Researchers for Equity in Education (CARE-ED), a statewide collaborative of university-based education researchers, analyzes the research basis for the assessments tied to the Common Core State Standards (CCSS) that have come to California. We provide historical background on the CCSS and the assessments that have accompanied them, as well as evidence of the negative impacts of high-stakes testing. We focus on the current implementation of CCSS assessments in California, and present several concerns. Finally, we offer several research-based recommendations for moving towards meaningful assessment in California’s public schools.

Highlights of the research brief are available for download here.
The complete research brief on CCSS Assessments is available for download here.”

Background from the two-page overview includes the following summary of concerns:

  • “The assessments have been carefully examined by independent examiners of the test content who concluded that they lack validity, reliability, and fairness, and should not be administered, much less be considered a basis for high-stakes decision making.
  • Nonetheless, CA has moved forward in full force. In spring 2015, 3.2 million students in California (grades 3-8 and 11) took the new, computerized Math and English Language Arts/Literacy CAASPP tests (California Assessment of Student Performance and Progress). Scores were released to the public in September 2015, and as many predicted, a majority of students failed.
  • Although proponents argue that the CCSS promotes critical thinking skills and student-centered learning (instead of rote learning), research demonstrates that imposed standards, when linked with high-stakes testing, not only de-professionalizes teaching and narrows the curriculum, but in so doing, also reduces the quality of education and student learning, engagement, and success.
  • The implementation of the CCSS assessments raises at least four additional concerns of equity and access. First, the cost of implementing the CCSS assessments is high and unwarranted, diverting hundreds of millions of dollars from other areas of need. Second, the technology and materials needed for CCSS assessments require high and unwarranted costs, and California is not well-equipped to implement the tests. Third, the technology requirements raise concerns not only about cost, but also about access. Fourth, the CCSS assessments have not provided for adequate accommodations for students with disabilities and English Language learners, or for adequate communication about such accommodations to teachers.”…

And the following quote captures a culminating statement:


“…We support the public call for a moratorium on high-stakes testing broadly, and in particular, on the use of scientifically discredited assessment instruments (like the current SBAC, PARCC, and Pearson instruments) and on faulty methods of analysis (like value-added modeling of test scores for high-stakes decision making).”…

For the full research brief, including guiding questions and recommendations, please see: http://www.care-ed.org

As of February 2, 2016, the following university-based researchers in California have endorsed the statement.
University affiliations are provided for identification purposes only.

Al Schademan, Associate Professor, California State University, Chico
Alberto Ochoa, Professor Emeritus, San Diego State University
Allison Mattheis, Assistant Professor, California State University, Los Angeles
Allyson Tintiangco-Cubales, Professor, San Francisco State University
Amy Millikan, Director of Clinical Education, San Francisco Teacher Residency
Anaida Colon-Muniz, Associate Professor, Chapman University
Ann Berlak, Retired lecturer, San Francisco State University
Ann Schulte, Professor, California State University, Chico
Annamarie Francois, Executive Director, University of California, Los Angeles
Annie Adamian, Lecturer, California State University, Chico
Anthony Villa, Researcher, Stanford University
Antonia Darder, Leavey Endowed Chair, Loyola Marymount University
Arnold Danzig, Professor, San José State University
Arturo Cortez, Adjunct Professor, University of San Francisco
Barbara Henderson, Professor, San Francisco State University
Betina Hsieh, Assistant Professor, California State University, Long Beach
Brian Garcia-O’Leary, Teacher, California State University, San Bernardino
Bryan K Hickman, Faculty, Solano Community College
Christine Sleeter, Professor Emerita, California State University, Monterey Bay
Christine Yeh, Professor, University of San Francisco
Christopher Sindt, Dean, Saint Mary’s College of California
Cindy Cruz, Associate Professor, University of California, Santa Cruz
Cinzia Forasiepi, Lecturer, Sonoma State University
Cristian Aquino-Sterling, Assistant Professor, San Diego State University
Danny C. Martinez, Assistant Professor, University of California, Davis
Darby Price, Instructor, Peralta Community College District
David Donahue, Professor, University of San Francisco
David Low, Assistant Professor, California State University, Fresno
David Stronck, Professor Emeritus, California State University, East Bay
Elena Flores, Associate Dean and Professor, University of San Francisco
Elisa Salasin, Program Director, University of California, Berkeley
Emma Fuentes, Associate Professor, University of San Francisco
Estela Zarate, Associate Professor, California State University, Fullerton
Genevieve Negrón-Gonzales, Assistant Professor, University of San Francisco
George Lipsitz, Professor, University of California, Santa Barbara
Gerri McNenny, Associate Professor, Chapman University
Heidi Stevenson, Associate Professor, University of the Pacific
Helen Maniates, Assistant Professor, University of San Francisco
Cynthia McDermott, Chair, Antioch University
Jacquelyn V Reza, Adjunct Faculty, University of San Francisco
Jason Wozniak, Lecturer, San José State University
Jolynn Asato, Assistant Professor, San José State University
Josephine Arce, Professor and Department Chair, San Francisco State University
Judy Pace, Professor, University of San Francisco
Julie Nicholson, Associate Professor of Practice, Mills College
Karen Cadiero-Kaplan, Professor, San Diego State University
Karen Grady, Professor, Sonoma State University
Kathryn Strom, Assistant Professor, California State University, East Bay
Kathy Howard, Associate Professor, California State University, San Bernardino
Kathy Schultz, Dean and Professor, Mills College
Katya Aguilar, Associate Professor, San José State University
Kevin Kumashiro, Dean and Professor, University of San Francisco
Kevin Oh, Associate Professor, University of San Francisco
Kimberly Mayfield, Chair, Holy Names University
Kitty Kelly Epstein, Doctoral Faculty, Fielding Graduate University
Lance T. McCready, Associate Professor, University of San Francisco
Lettie Ramirez, Professor, California State University, East Bay
Linda Bynoe, Professor Emerita, California State University, Monterey Bay
Maren Aukerman, Assistant Professor, Stanford University
Margaret Grogan, Dean and Professor, Chapman University
Margaret Harris, Lecturer, California State University, East Bay
Margo Okazawa-Rey, Professor Emerita, San Francisco State University
Maria Sudduth, Professor Emerita, California State University, Chico
Marisol Ruiz, Assistant Professor, Humboldt State University
Mark Scanlon-Greene, Mentoring Faculty, Fielding Graduate University
Michael Flores, Professor, Cypress College
Michael J. Dumas, Assistant Professor, University of California, Berkeley
Miguel López, Associate Professor, California State University, Monterey Bay
Miguel Zavala, Associate Professor, Chapman University
Mónica G. García, Assistant Professor, California State University, Northridge
Monisha Bajaj, Associate Professor, University of San Francisco
Nathan Alexander, Assistant Professor, University of San Francisco
Nick Henning, Associate Professor, California State University, Fullerton
Nikola Hobbel, Professor, Humboldt State University
Noah Asher Golden, Assistant Professor, Chapman University
Noah Borrero, Associate Professor, University of San Francisco
Noni M. Reis, Professor, San José State University
Patricia Busk, Professor, University of San Francisco
Patricia D. Quijada, Associate Professor, University of California, Davis
Patty Whang, Professor, California State University, Monterey Bay
Paula Selvester, Professor, California State University, Chico
Pedro Nava, Assistant Professor, Mills College
Pedro Noguera, Professor, University of California, Los Angeles
Penny S. Bryan, Professor, Chapman University
Peter McLaren, Distinguished Professor, Chapman University
Rebeca Burciaga, Assistant Professor, San José State University
Rebecca Justeson, Associate Professor, California State University, Chico
Rick Ayers, Assistant Professor, University of San Francisco
Rita Kohli, Assistant Professor, University of California, Riverside
Roberta Ahlquist, Professor, San José State University
Rosemary Henze, Professor, San José State University
Roxana Marachi, Associate Professor, San José State University
Ruchi Agarwal-Rangnath, Adjunct Professor, San Francisco State University
Scot Danforth, Professor, Chapman University
Sera Hernandez, Assistant Professor, San Diego State University
Shabnam Koirala-Azad, Associate Dean and Associate Professor, University of San Francisco
Sharon Chun Wetterau, Asst Field Director & Lecturer, CSU Dominguez Hills
Sumer Seiki, Assistant Professor, University of San Francisco
Suresh Appavoo, Associate Professor, Dominican University of California
Susan Roberta Katz, Professor, University of San Francisco
Susan Warren, Director and Professor, Azusa Pacific University
Suzanne SooHoo, Professor, Chapman University
Teresa McCarty, GF Kneller Chair, University of California, Los Angeles
Terry Lenihan, Associate Professor and Director, Loyola Marymount University
Theresa Montano, Professor, California State University, Northridge
Thomas Nelson, Doctoral Program Coordinator, University of the Pacific
Tomás Galguera, Professor, Mills College
Tricia Gallagher-Geurtsen, Adjunct Faculty, University of San Diego
Uma Jayakumar, Associate Professor, University of San Francisco
Ursula Aldana, Assistant Professor, University of San Francisco
Valerie Ooka Pang, Professor, San Diego State University
Walter J. Ullrich, Professor Emeritus, California State University, Fresno
Zeus Leonardo, Professor, University of California, Berkeley

_______________________

California Alliance of Researchers for Equity in Education. (2016). Common Core State Standards Assessments in California: Concerns and Recommendations. Retrieved from http://www.care-ed.org.

 

##

CARE-ED, the California Alliance of Researchers for Equity in Education, is a statewide collaborative of university-based education researchers that aims to speak as educational researchers, collectively and publicly, and in solidarity with organizations and communities, to reframe the debate on education.

___________________________________

 

For main post, see: 

http://eduresearcher.com/2016/03/16/sbac-moratorium/ 

 

For Washington Post coverage of the document, see: 
https://www.washingtonpost.com/news/answer-sheet/wp/2016/03/16/education-researchers-blast-common-core-standards-urge-ban-on-high-stakes-tests/

For related posts on EduResearcher, see here, here, and here.
For a collection on high-stakes testing with additional research and updates, visit “Testing, Testing, 1,2,3…”
http://bit.ly/testing_testing 

 


Open Letter to the CA State Board of Education on Release of [False] "Smarter Balanced" Scores


"Dear Members of the California State Board of Education,

Last spring, 3.2 million students in California (grades 3-8 and 11) took the new, computerized Math and English Language Arts/Literacy CAASPP tests (California Assessment of Student Performance and Progress). The tests were developed by the Smarter Balanced Assessment Consortium, and administered and scored by ETS (Educational Testing Service). Costs are estimated at $360 million in federal tax dollars and $240 million in state funds for 3 years of administration and scoring.
 

Despite the documented failure of the assessments to meet basic standards of testing and accountability, [invalid] scores are scheduled to be released to the public on September 9th.  According to media reports, the 11th grade scores will be used for educational decision-making by nearly 200 colleges and universities in six states. For detailed documents, see Critical Questions about Computerized Assessments and SmarterBalanced Test Scores, the SR Education SBAC invalidation report, the following video, and transcript provided here.
 

At the September 2nd, 2015 State Board of Education meeting, you heard public comment from Dr. Doug McRae, a retired test and measurement expert who has for the past five years communicated directly and specifically to the Board about validity problems with the new assessments.  He has submitted the following written comments for Item #1 [CAASPP Update] at the latest meeting and spoke again about the lack of evidence for validity, reliability, and fairness of the new assessments."...

 

For full post, click on title above or here:

http://eduresearcher.com/2015/09/08/openletter/ 

 


Got "Grit"? Maybe... // Phi Delta Kappan

 

Shared with permission from the author. To download, click on the title above or the link below. The full April 2017 edition of Phi Delta Kappan, including this article, is also available at: http://www.kappanonline.org/april-2017-table-contents/

 

Duckor, B. (2017). Got grit? Maybe... Phi Delta Kappan, 98(7), 61-66. Available online at https://www.pdkmembers.org/members_online/publications/archive/pdf/PDK_98_7/61pdk_98_7.pdf

 


Despite Warnings, College Board Redesigned SAT in Way That May Hurt Neediest Students // Reuters


By Renee Dudley  
"Part Six: Internal documents show the makers of the new SAT knew the test was overloaded with wordy math problems – a hurdle that could reinforce race and income disparities. The College Board went ahead with the exam anyway." 
[Picture caption] SOUNDED ALARM: Dan Lotesto, who teaches at the University of Wisconsin-Milwaukee, helped the College Board review potential questions for the redesigned SAT and warned that many questions were sloppy or too long – and that test-takers would suffer. REUTERS/Darren Hauck

 

"NEW YORK – In the days after the redesigned SAT college entrance exam was given for the first time in March, some test-takers headed to the popular website reddit to share a frustration.

 

They had trouble getting through the exam’s new mathematics sections. “I didn’t have nearly enough time to finish,” wrote a commenter who goes by MathM. “Other people I asked had similar impressions.”

 

The math itself wasn’t the problem, said Vicki Wood, who develops courses for PowerScore, a South Carolina-based test preparation company. The issue was the wordy setups that precede many of the questions.

 

“The math section is text heavy,” said Wood, a tutor who took the SAT in May. “And I ran out of time.”

 

The College Board, the maker of the exam, had reason to expect just such an outcome for many test-takers.

 

When it decided to redesign the SAT, the New York-based not-for-profit sought to build an exam with what it describes as more “real world” applications than past incarnations of the test. Students wouldn’t simply need to be good at algebra, for instance. The new SAT would require them to “solve problems in rich and varied contexts.”

 

But in evaluating that approach, the College Board’s own research turned up problems that troubled even the exam makers."...

 

----------------

  • Part One: College Board gave SATs it knew were “compromised”

  • Part Two: Despite tighter security, new SAT gets hacked

  • Part Three: Chinese cheating rings penetrate U.S. colleges

  • Part Four: Widespread cheating alleged in program owned by ACT

  • Part Five: Breach exposes questions for upcoming SAT exams


For full post and links to Parts 1-5 of the Investigative series, click on title above or here: http://www.reuters.com/investigates/special-report/college-sat-redesign/ 

 

 

 


Student Data Being Stored and Swapped Among Many Agencies // By Morgan Boydston, KTVB [Click on title for full news video/report]


By Morgan Boydston, KTVB
BOISE - How would you feel if you found your child was being tracked from the minute you registered them for kindergarten, until they enter the work force? Idaho has several agreements that allow and require them to do just that. Many parents don't realize that their student's personal information is being collected and shared at the state and federal level, among many different agencies.


KTVB talked with concerned parents, as well as the State Department of Education to get to the bottom of why students' data is being stored and shared.


"I believe our youngest, most vulnerable citizens probably should have the most protection of privacy," said Mila Wood, a concerned mother and spokesperson for Idahoans for Local Education.
 

From the very beginning of the school day, children are shedding data. From the bus stop to the classroom laptop, hundreds of data points are being collected by state, corporate and federal agencies.


"They collect everything, they absolutely collect everything," Wood added. "Actually one of the very first items that kind of brought my attention was this little card in my son's wallet when he was in eighth grade and it's an Idaho Department of Labor card." 


Stacey Knudsen is another parent active in finding out how, where and why her children's personal information is being stored. Sensitive information attached to their individual student ID numbers such as disciplinary actions, meal choices, socioeconomic status and much, much more.


"This information is really sensitive," Wood added.

"When we talk about keeping kids safe, and their data, that's extremely important," said Jeff Church, spokesperson for the Idaho Department of Education.


The Idaho State Board of Education is constitutionally responsible for supervising public education from kindergarten through college.

For that reason, a state-wide data system was created to evaluate and improve the process by which a student moves through the education system in Idaho. The Board works in conjunction with the State Department of Education, which tracks K-12 data.


"We only collect the data that we truly need, whether it be for federal reporting, state reporting or financial calculations and payments out to school districts," Church said. "Over the last year we have gone through a process of removing upwards of 200 data elements within that system."


Department officials say they have been working to collect a lot less data than they used to by asking the question: Do we need the data?

"If we don't for federal or state or financial calculations, we don't need it and we don't want it," Church added.


But parents say they are still concerned because the Department of Education still collects 390 elements and many of those elements are alarming.


"There's certainly not a need for us to be storing the amount of data that we're storing," said Knudsen.


Church argues that the aggregate academic information, like test scores, is crucial for policy-making decisions and measuring Idaho's success compared to other states.


"Seeing the data and how students across the state are doing on math informs the superintendent on policy decisions to say we need to make a change and move toward what works," he said.

Concerned parents believe the problems stem from personally identifiable information that other state, federal and private agencies have access to.


"Where is this information going? Who is utilizing my child's psychometric data?" Wood asked.


Parents also wonder why they are not given the option to give, or deny, consent for the collection and sharing of their children's data.


"Nobody can really give us a clear picture of who is accessing and how they're keeping that data safe," Knudsen said.


To protect that data, there is a federal law in place called the Family Educational Rights and Privacy Act, or FERPA. But activists say it has become too relaxed over the last several years. In 2014, Idaho enacted its own student privacy law. The Board of Education says the state-appointed Data Management Council does not allow a free flow of information because the council oversees any requests for access to the data.


"At no time is an individual student's data utilized for decision-making purposes or for individual purpose of any kind," Church added.


The department shares group and personal data with many state and federal departments, as well as private companies, including, but not limited to:

  • The Department of Health and Welfare
  • Federal Education Facts
  • Smarter Balanced Consortium
  • Title I Student Counts
  • Migrant Student Information Exchange
  • ISAT, College Board
  • Data Recognition Corp
  • Individual Student Identifier for K-12 Longitudinal Data System
     

"So they are all sharing the data together within our state longitudinal data system," Wood said.


They also share with the State Board of Education, which has agreements with other state and federal agencies such as the Department of Juvenile Corrections, Department of Labor, Department of Transportation, and National Student Clearinghouse.

Knudsen and Wood have plenty of advice for parents who are just finding out about this phenomenon.
 

"What you think is just between you and the teacher and the school, that's no longer the case," Knudsen said. "Be a little more wary of what you fill out, and really read through the documents that you're signing at school."
 

Church says parents can contact the Department of Education and ask to see their child's personal data. Parents must file a public records request, and then meet with a representative in person.
 

The Board of Education says parents also have the option to go directly to their child's school and request to see the data there, at the source."


For full post including news video coverage, click on title above or here: 
http://www.ktvb.com/news/investigations/7-investigations/student-data-being-stored-and-swapped-among-many-agencies/50092079 

Scooped by Roxana Marachi, PhD

How The SAT and PSAT Collect Personal Data On Students — and What The College Board Does With It // Washington Post 


By Valerie Strauss
"If your child takes the SAT or PSAT, is his or her personal information being collected, profiled, licensed and sold?

 

That is the question that Cheri Kiesecker, Colorado parent and member of the Parent Coalition for Student Privacy, asks and attempts to answer in the following important post. The Parent Coalition for Student Privacy is a national alliance of parents and advocates defending the rights of parents and students to protect their data.

 

The SAT has traditionally been used as a college entrance exam, but it and the ACT, also a college entrance exam, are increasingly being used as high school tests. In fact, 25 states now require that high school students take them for school accountability purposes, Education Week reported here.

 

The protection of personal data is in the news with the recent passage by Congress of legislation that eliminates landmark online privacy protections established by the Obama administration. It removes limits that had been placed on Internet service providers — such as AT&T, Comcast and Verizon — on how they can use data they collect on their customers, including browsing habits and Social Security numbers. Privacy advocates are especially concerned with how this will affect young people."...

 

 

For full post, see: https://www.washingtonpost.com/news/answer-sheet/wp/2017/03/30/how-the-sat-and-psat-collect-personal-data-on-students-and-what-the-college-board-does-with-it/?utm_term=.6ac82597d1af 

 

[Image adapted from Janneke Staaks via Creative Commons Flickr]

 

Scooped by Roxana Marachi, PhD

Researchers Protest Use of SBAC [CAASPP] For Reclassifying English Learners 


See main post here: https://edsource.org/2017/researchers-advocates-divided-over-reclassifying-english-learners/582175 

 

To download a copy of the researchers' letter, see here: https://www.documentcloud.org/documents/3723760-EL-RECLASS-SB463-LinquantiLet042817.html 

Scooped by Roxana Marachi, PhD

Resolution Opposing High-Stakes Teacher Candidate Performance Assessments // National Council of Teachers of English 


Approved by NCTE Members Voting at the Annual Business Meeting for the Board of Directors and Other Members of the Council, November 2016

 

Ratified by a Vote of the NCTE Membership, February 2017

Background
"High-stakes teacher candidate performance assessments are a serious, imminent threat to the integrity of the field of English Education and to the teaching profession as a whole. High-stakes performance assessments are being used in at least 710 teacher education programs across thirty-nine states. Additionally, passing a high-stakes assessment is required or under consideration for licensure in sixteen states and could become a requirement for teacher licensure nationwide.

 

There are widespread concerns regarding the lack of predictive validity and the privatization of teacher candidate performance assessments as well as the potential for disparate impact of these assessments on preservice English educators and their students.

 

  • Unlike university-based evaluations of candidates’ performance, through which multiple stakeholders assess candidates’ growth over time, high-stakes performance assessment scorers often determine competency based upon a limited number of lessons in candidates’ self-curated portfolios.

  • High-stakes performance assessments diminish the quality of preservice teacher education by shifting the focus of student teaching from student learning and professional growth to a single assessment.

  • Because they are standardized and delocalized, high-stakes performance assessments undermine the abilities of candidates to develop the skills necessary to meet the unique, situated needs of students in their classrooms. 

  • By privileging a singular, culturally hegemonic construction of effective teaching, high-stakes performance assessments inhibit the preparation of candidates to be culturally and contextually responsive teachers who are ready to respond to the unique needs of learners.

  • High-stakes performance assessments privilege a narrow definition of academic language in the classroom. 

  • Unanswered questions regarding the use, transmission, and storage of high-stakes performance assessment video clips create privacy concerns that put K-12 students and their teachers at risk.


High-stakes teacher candidate performance assessments do not meet NCTE’s characteristics of a fair, effective, and successful system of teacher evaluation (“Position Statement on Teacher Evaluation,” 2012) and violate NCTE’s resolutions on testing and social justice in literacy education (“Resolution on the Students’ Right to Their Own Language,” 1974; “Resolution on Testing,” 1995; “Resolution on Social Justice in Literacy Education,” 2010; “Resolution on Student Educational Data Privacy and Security,” 2015). Additionally, high-stakes performance assessments violate CEE’s position statements on social justice and diversity in language arts education (“Beliefs about Social Justice in English Education,” 2009; “Supporting Linguistically and Culturally Diverse Learners in English Education,” 2008). Be it therefore

Resolved, that the National Council of Teachers of English

  • strongly oppose legislation mandating the requirement that candidates pass high-stakes teacher performance assessments as a requirement for licensure;

  • strongly oppose the use of standardized high-stakes assessments during candidates’ student-teaching experiences; and

  • encourage its members to engage in critical scholarship and teaching about teacher candidate performance assessments."

 

http://www.ncte.org/positions/statements/teacher-cand-perf-assess 

 

 

Scooped by Roxana Marachi, PhD

State Failed At Assessing Standardized Testing // By Roscoe Caron, Register-Guard


By Roscoe Caron

"House Bill 2713, passed in 2013, directed the secretary of state’s office to conduct an audit of the costs of high-stakes standardized testing — a homework assignment of assessing the assessments.

 

It was a noble effort by HB 2713's author, Rep. Lew Frederick, D-Portland, to get a handle on the testing industry in Oregon, but the fingerprints of the state Department of Education, the state testing bureaucracy and testing-advocate corporate "reform" groups such as Stand for Children and the Chalkboard Project are all over the secretary of state's report.


First, the audit interviews were conducted overwhelmingly with testing advocates. Of the 22 organizations that were consulted, only four groups had a history of seriously questioning these tests. Thirteen groups were associated with testing advocacy organizations or with the profitable testing business and bureaucracy.

In addition, multiple interviews were conducted with the education department’s assessment personnel and their boss, Salam Noor, Oregon’s de facto superintendent and an ardent supporter of high-stakes testing.


To top it off, the leaders of the high-stakes testing creator, Smarter Balanced Assessment Consortium, were interviewed. Let’s guess how objective they were.

Other voices were not seriously solicited. Site visits were conducted at only six schools. That’s 0.5 percent of Oregon’s 1,200 schools. Only 577 parent responses represented Oregon’s 567,000 students. Administrators from 48 percent of Oregon’s school districts responded, but Eugene School District administrators said they were never contacted.

 

A small number of disability rights and civil rights groups were interviewed. Their concerns are valid, but they are really not players in the high-stakes, big money arena of Ore­gon testing and corporate “reform.”

That the fix was in was telegraphed on the very first pages of the secretary of state's audit report, which reiterates all of the major marketing phrases that have been used repeatedly to sell these tests to politicians, journalists, education bureaucrats, school boards and parents.

These selling points have been debunked time and again by independent researchers and by the teachers who know full well how severely limited these tests are. Numerous researchers and studies show that these tests are fundamentally invalid, provide little useful information and are skewed to the advantage of those students from college-educated families with access to a host of resources.

The audit was supposed to identify "fiscal, administrative and education impacts" of state testing. Really? The audit says these tests cost $10.2 million — interesting, since former Oregon Deputy Superintendent Rob Saxton estimated the costs at $27 million.

The audit’s listed cost figure is only what the state pays to the American Institutes for Research for statewide logistics and to the Smarter Balanced Assessment Consortium for “membership fees.” This conveniently ignores costs such as the huge testing bureaucracy within the Oregon Department of Education.

 

What other impacts are ignored? How about the costs in teacher training and staff meeting time, school-based test coordinators, test-associated practice materials, and the multitude of district and school-based testing coordinators?

 

Never mind the nonmonetary costs of ruthlessly tracking low-income kids from an early age, turning libraries and computer labs into testing centers for months, and driving nontested art, music, social studies and electives from our schools. How about the cost of driving out of teaching the creative, dynamic teachers who can no longer stomach what is happening to public education?

HB 2713 also mandated examining “potential problems” and then making “recommendations.” Now the report becomes surreal. Stacking the deck against kids with disabilities and kids who don’t speak English? Low-income kids and schools being unfairly targeted as “failures”? Tests eating up scarce money from district budgets? Excess student anxiety in the classroom? Teachers feeling like cogs in a multibillion-­dollar testing machine? Growing public mistrust of that testing machine?


The audit’s recommendation: Spend more state money on marketing and PR! Do a better job selling these tests to the public: “Identify and expand communication efforts ... .”

Wait, there’s more. To the criticism that the late spring tests are worthless in giving useful information to teachers, the state audit recommends — get this — overturning House Bill 5008, which limits state-administered tests to only end-of-the-year assessments. Then what? The audit recommends we “... expand the use of formative and interim assessments ... .” Great! Smarter Balanced standardized tests — all year long!

This is where we are. The state testing bureaucracy and its corporate “reform” friends are determined to have more standardized tests and to sell their vision to local districts and parents. Most of our local elected representatives and school boards can’t bring themselves to say no.

It’s up to parents and students to stop this madness. Opt out this year."

Roscoe Caron, a former middle school teacher, teaches at the University of Oregon and is a member of the Community Alliance for Public Education.

 

 

 

 

Scooped by Roxana Marachi, PhD

Poet: I Can’t Answer Questions On Texas Standardized Tests About My Own Poems // Washington Post


https://www.washingtonpost.com/news/answer-sheet/wp/2017/01/07/poet-i-cant-answer-questions-on-texas-standardized-tests-about-my-own-poems/ 

Scooped by Roxana Marachi, PhD

After July 2017, UCLA Will No Longer Oversee Smarter Balanced // Dr. Mercedes Schneider 

By Mercedes Schneider
"In February 2016, the Partnership for Assessment of Readiness for College and Career (PARCC) posted a Request for Information (RFI) in which it formally solicited advice on how it should proceed as a Common Core testing consortium.

The result of that PARCC RFI was a 128-page response by a number of organizations, which I wrote about here.

One of the suggestions for PARCC’s future concerns its being overseen by the other Common Core consortium, Smarter Balanced. Interestingly, Smarter Balanced, which included its own advice in response to PARCC’s RFI, did not itself offer to oversee PARCC.

What Smarter Balanced offered in its advice for PARCC were details on how Smarter Balanced successfully operated as a consortium.

 

That is why it is surprising that EdWeek's Sean Cavanagh reports that Smarter Balanced will be seeking a new fiscal agent. Smarter Balanced has a contract with the University of California system, and UCLA has been serving as the Smarter Balanced fiscal agent. UCLA's 3-year contract for this role expires on June 30, 2017.

On September 28, 2016, UCLA notified Smarter Balanced that it was not interested in continuing to oversee the consortium. Cavanagh reports that Smarter Balanced is in negotiations for another university in the University of California system to oversee Smarter Balanced.

UCLA notes that it will continue to “focus on scholarly work and new research in coordination” with Smarter Balanced, but UCLA will not run the consortium.

 

Of course, the very fact that Smarter Balanced will be transitioning to a new fiscal agent means that its future stability is in question.

So, America, we have two Common Core testing consortia, both of which face questionable stability – which also points to questionable sustainability.

Of course, PARCC and Smarter Balanced both need states as consortium members in order to survive.

 

It remains to be seen how states will respond to the fact that both PARCC and Smarter Balanced are experiencing their own internal struggles."...

 

For full post, click on title above or here: 

https://deutsch29.wordpress.com/2016/10/19/come-june-2017-ucla-will-no-longer-oversee-smarter-balanced/ 

 
