COMMON-CORE-ALIGNED SAT NOT READY UNTIL 2016
By Donna Garner
12.6.13 — David Coleman, who has never taught a day in his life in K-16, was the main writer of the Common Core Standards for English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects — referred to as the Common Core Standards for English.
David Coleman is now the head of The College Board. The College Board has just announced that the new SAT (aligned with the Common Core Standards) will not be administered until 2016. The PSAT/NMSQT® (aligned with the Common Core Standards) will not be administered until the fall of 2015.
The good news is that the elections of 2014 are coming. If voters elect the right kind of people to Congress, people who are savvy about the dangers of the Common Core Standards, then the Obama administration’s entire plan to take over the public schools through the Common Core Standards will be stymied.
I have posted underneath the Washington Post article an explanation about the importance of the writing section on the present SAT. The SAT writing section has 49 grammar/usage questions (with right-or-wrong answers), and 70% of the score on the writing section is based upon the grammar/usage questions. The essay score accounts for the other 30%. The research done by the College Board shows that the biggest predictor of success for college freshmen is the score on the SAT’s writing section, 70% of which is based upon a student’s ability to use English grammar/usage correctly.
The conclusion that should be drawn is that if students will learn their English grammar/usage, they will become better writers, thus improving their chances of doing well in their first-year courses in college.
It would be a terrible shame if the College Board under David Coleman were to wipe out the grammar/usage part of the SAT; but I would not be one bit surprised if that were Coleman’s plan. – Donna Garner
12.4.13 – Washington Post
“New SAT delayed to 2016”
EXCERPTS FROM THIS ARTICLE:
The new version of the SAT college admissions exam [aligned with the Common Core Standards] that was due to be unveiled in 2015 is now being delayed until spring 2016.
David Coleman, the president of the College Board, the nonprofit organization that owns the SAT, sent the following e-mail to College Board members about the delay:
…We have made the decision to adjust our schedule for this work and will now release the revised PSAT/NMSQT® in fall 2015 [aligned with the Common Core Standards], followed by the release of the revised SAT® in spring 2016 [aligned with the Common Core Standards].
We heard clearly from our members — including our Board of Trustees, national and regional councils, the SAT committee, attendees at our national Forum, and particularly those in higher education — that you need more time, and we listened.
…This change in the timing of the redesign will serve our members in higher education by providing two years to plan for the redesigned exam, familiarize themselves with changes, and meet system and publication requirements.
…We have also heard the needs of states and districts. The K–12 community has expressed a strong preference for students to be able to take the revised PSAT/NMSQT [aligned with the Common Core Standards] before the revised SAT [aligned with the Common Core Standards]. Releasing the revised PSAT/NMSQT in the fall of 2015 will address this need, and we will continue to communicate with state and district leaders regarding this important work.
…Thank you for your continued support.
Coleman, a co-author of the Common Core English/Language Arts standards, became president of the College Board in 2012 and said earlier this year [http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/02/26/sat-exam-to-be-redesigned/] that the SAT would be rewritten to better meet the needs of students. Coleman has said [http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/01/03/new-college-board-chief-cites-problems-with-sat/] he has a number of problems with the SAT as now written, including with its essay and vocabulary words.
For decades the SAT was the most popular college admissions exam, but last year, for the first time, its rival, the ACT, overtook it. The decision to delay the rollout of the new SAT could give the ACT more time to solidify its position in the college admissions testing contest.
The SAT was first given in 1926. It was revamped less than a decade ago when a written essay was added and some of the question formats were changed.
6.18.08 – From Donna Garner:
Questions: How can students raise their SAT scores to help them get into the colleges of their choice, and how can these students be successful once they get there?
What a person should conclude from this SAT study just released today is that the Writing section is a very good predictor of students’ first-year college grades. Since 70% of the writing score is based upon grammar/usage questions (the essay counts only 30%), the grammar/usage scores evidently are a good predictor of students’ first-year college grades.
The question should be then, “How do students raise their grammar/usage scores?” Obviously, they need to learn grammar/usage.
New studies provide first-year performance data
The College Board introduced a revised SAT, with an additional writing section [two writing subscores — grammar/usage 70% and essay 30%] and minor changes in content to the verbal and mathematics sections, in March 2005. Colleges and universities across the United States provided first-year performance data for the fall 2006 entering cohort of first-year, first-time students to validate the use of the SAT in college admissions.
Two studies are now available:
- Validity of the SAT for Predicting First-Year College Grade Point Average (.pdf/550K) looks at the overall ability of the SAT to predict performance in the first year of college.
- Differential Validity and Prediction of the SAT (.pdf/565K) evaluates whether the SAT is fair and consistent across the key demographic variables of gender, race/ethnicity, and best language.
The final sample included 151,316 students attending 110 colleges and universities.
Results of the SAT Validity Studies
The College Board research studies analyzed the data submitted by the 110 colleges that participated in the College Board’s Admitted Class Evaluation Service (ACES). These colleges received their ACES study results in the fall and winter of 2007-08. Many other colleges and college systems, such as the University of California system, conducted their own studies. For both the University of California and the College Board studies, the results are similar. Writing is the most predictive section of the SAT, slightly more predictive than either math or critical reading. In the California study, SAT scores were slightly more predictive than high school grade point average (HSGPA). In the College Board analysis of the more than 150,000 students included in all 110 ACES studies, HSGPA was slightly more predictive than SAT scores.
Validity of the SAT for Predicting First-Year College Grade Point Average study
The main analytic method used for this study was the comparison of single and multiple correlations of predictors (SAT scores, HSGPA) with first-year college GPA (FYGPA). All correlations were corrected for range restriction.
The results show that the SAT continues to be a very strong predictor of first-year college performance, and that the changes made to the SAT add to the test’s validity. Read a summary of the key findings. (.pdf/32K)
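The range-restriction correction mentioned above has a standard closed form, often called Thorndike's Case II correction. Because the enrolled students a college can study are more homogeneous than the full applicant pool, the raw correlation understates the test's predictive power, and the correction adjusts it upward. A minimal sketch, using illustrative numbers that are not taken from the study:

```python
import math

def correct_range_restriction(r_observed, sd_restricted, sd_unrestricted):
    """Thorndike Case II correction: estimate what the correlation would
    be in the full applicant pool, given the correlation observed in the
    enrolled (range-restricted) sample and the predictor's standard
    deviation in each group."""
    u = sd_unrestricted / sd_restricted
    return r_observed * u / math.sqrt(1 - r_observed**2 + (r_observed * u)**2)

# Illustrative numbers only (not from the study): an observed r of 0.35
# among enrolled students, where the predictor's SD is 80 in the enrolled
# sample but 110 in the full applicant pool.
print(round(correct_range_restriction(0.35, 80, 110), 3))  # → 0.457
```

The corrected value is always at least as large as the observed one, since restricting the range of the predictor can only weaken the observed relationship.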
Differential Validity and Prediction of the SAT study
The purpose of this study was to assess the differential validity and differential prediction of the revised SAT for gender, racial/ethnic, and best language subgroups. Differential validity exists if the magnitude of the test-criterion correlation varies by subgroup. Differential prediction occurs when a test systematically over- or underpredicts the criterion (e.g., FYGPA) by subgroup. The results are similar to prior research indicating that changes to the SAT did not diminish the differential prediction and validity of the test, and the SAT continues to be a fair test for all students. Read a summary of the key findings. (.pdf/49K)
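The definition of differential prediction can be made concrete: fit one prediction rule to everyone, then check whether the rule systematically misses for some subgroup. A minimal sketch with made-up numbers (the scores, grades, and subgroups below are invented for illustration and are not drawn from the study):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical (SAT section score, FYGPA, subgroup) triples.
data = [
    (500, 2.4, "A"), (550, 2.7, "A"), (600, 3.0, "A"), (650, 3.3, "A"),
    (500, 2.2, "B"), (550, 2.5, "B"), (600, 2.8, "B"), (650, 3.1, "B"),
]

# Fit one common prediction rule for all students, then inspect the mean
# residual (actual minus predicted FYGPA) within each subgroup.
a, b = fit_line([s for s, _, _ in data], [g for _, g, _ in data])
for grp in ("A", "B"):
    resid = [g - (a + b * s) for s, g, gg in data if gg == grp]
    print(grp, round(sum(resid) / len(resid), 2))  # A 0.1, then B -0.1
```

In this fabricated example the common rule underpredicts group A's FYGPA (positive mean residual) and overpredicts group B's (negative mean residual); that pattern is differential prediction. The study's finding is that the revised SAT shows no more of it than prior versions did.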
Implications of the studies
Both the College Board and the University of California studies indicate that writing is the most predictive section of the SAT. Colleges not requiring an admissions test with writing, therefore, are overlooking the most useful section of the test and one of the best predictors of college success to which they have access. Writing as a college-level skill is a crucial asset for student success, an important message reinforced by colleges that require admissions tests with a writing section.
Among the three individual SAT sections, SAT writing has the highest correlation with FYGPA [first-year college grade point average] (r = 0.51). The correlation is 0.48 for SAT critical reading and 0.47 for SAT mathematics. In fact, SAT writing alone has the same correlation with FYGPA as do SAT critical reading and SAT mathematics taken together…
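The closing claim, that writing alone predicts as well as critical reading and mathematics combined, follows from the arithmetic of multiple correlation: when two predictors are highly correlated with each other, combining them adds little beyond either one alone. A minimal sketch, assuming an illustrative intercorrelation of 0.72 between the critical reading and mathematics sections (a figure the excerpt does not report):

```python
import math

def multiple_R(r1y, r2y, r12):
    """Multiple correlation of two predictors with a criterion,
    computed from the three pairwise correlations."""
    R_squared = (r1y**2 + r2y**2 - 2 * r1y * r2y * r12) / (1 - r12**2)
    return math.sqrt(R_squared)

# r(critical reading, FYGPA) = 0.48 and r(math, FYGPA) = 0.47 come from
# the passage; r12 = 0.72 between the two sections is an assumed value.
print(round(multiple_R(0.48, 0.47, 0.72), 2))  # → 0.51
```

Under that assumed intercorrelation, the two sections taken together reach a multiple correlation of about 0.51 with first-year GPA, exactly matching the 0.51 reported for SAT writing alone.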