Coming to work regularly and on time, dressing appropriately, and showing respect for company property are obvious employer expectations, no matter what industry you choose for your career.
Nevertheless, employer surveys across the nation reveal businesses are having trouble filling entry-level positions with applicants who possess these characteristics, not to mention the other “soft skills” such as the ability to work through conflict, cope with change, use basic math and read and understand written information.
“Believe it or not, these soft skill components we’re talking about — integrity, punctuality, individual responsibility, ingenuity — those are skills that are unfortunately not focused on in our traditional education model,” Daugherty said.
How can any given data report or a single test accurately measure these types of "soft skills," which are absolutely necessary in EVERY industry?
However, I sat through a two-hour school board meeting last night listening to the madness of how data somehow proves a student's worth and success based on the NWEA MAP test!! A teacher's worth is most certainly a reflection of a student's test score, which defines what they have learned and what their potential is or isn't! SERIOUSLY?! Principals and teachers alike LOVE the NWEA charts and diagrams because they show what a student's progress is! According to what? Because Microsoft and Bill Gates said so? Because it looks pretty and neat on that piece of paper? Because we paid $84,000, it MUST work; essentially, $13 per student makes it reliable and therefore factual and truthful.
Have you taken an actual NWEA sample test, the adaptive model? Have you seen how kids figure out ways to manipulate the system? It is student-directed on a computer, as you know. Did you know ALL DATA can be skewed? Where does this data go? Have you seen how subjective these NWEA questions are? How well do you, as an adult, answer subjective multiple-choice questions? How well does a 5-year-old do on a self-directed computer aptitude test? Did you know there are multiple variables that can mess up the data? What mood was that child in that day? What happened that morning? What did they eat or not eat for breakfast? What kind of dispute did they get into with their friends or family just before sitting down to take that computerized benchmark? Did they get a good night's sleep? Are they sad because their grandfather passed away three weeks ago and their mind is still on the funeral? LIFE HAPPENS, children are children, teachers truly know each child, and a computer does not EVER see the whole picture!! Why are we relying solely on data to tell us what a child is worth? Because it's easy, because it looks good, because we have bought into the $84,000 model of propaganda and attended a seminar that told us it's the best thing since sliced bread? When did we accept the mindset of students as a number, students as profit, students as HUMAN CAPITAL, and whatever that data sheet tells us?
Just because my daughter's reading fluency is off the charts does not mean she is a model reader! Did you know she usually can hardly remember anything she just read? She's too busy reading it as fast as she can; it sounds good, smooth and concise, yet she barely remembers bits and pieces. However, she blows these benchmark tests out of the water!!! Every night I read with her, and she is typically confused; we have to go back, slow down, and REALLY read the words to learn and understand, not just READ!!
My son, on the other hand, is very slow and very analytical and NEVER does well on these fluency tests or benchmarks; he takes his time, is choppy, and is very attuned to every aspect and angle when he reads. He has progressed and gotten better at taking these ridiculous tests, mainly because he figured out how to manipulate the system: don't answer the first questions right, so the questions don't keep getting harder! Too smart for his own good and a huge disservice to the data model! Nobody catches on; he slides through and tells me during our chit-chat at night, and I understand he has already analyzed the system. He says, "I'm very good at it anyways, so I might as well not put forth my best effort. Nobody REALLY pays attention." As a mom, I know he's absolutely right: it just gets filed away somewhere as "proof," and it's not really that great an indicator, just another dumb report, so seriously, who really gives a hoot?
Well, I do, as a mother and a taxpaying citizen who sees these students stereotyped and/or labeled based on their test scores, and both of my children are proof that those scores are NOT valid or accurate! Why would we continue to spend $84,000 on this type of program? Here are 10 reasons why every principal and teacher should be asking more questions and digging into this nationalized NWEA benchmark testing... and where is the FUNDING going to come from? Why not just pay our well-deserving teachers more with that $12-$13 per student?
10 Reasons Why Our District Should Not Renew the NWEA MAP Testing
1. More lost class time and lost opportunity for learning. Students are again on a computer taking yet another test. We only get 180 days as it is; why waste valuable class time and teaching opportunities?
2. It's EXPENSIVE!! It is a licensed product with ongoing costs that take more $$$ away from the classroom. Funding is already an issue.
3. This test is often misused to evaluate teachers. Data and graphs essentially replace the teacher, as if you as a teacher have no input and/or value. Teachers are being told that their students' MAP test scores are a reflection of their teaching, even though the test often does not align with what they have taught or are teaching.
4. Excessive. MORE TESTING! This test is given 2-3 times per year, can take anywhere from 15 minutes to 1 hour each time, and comes on top of all the other testing students already do.
5. MAP tests narrow the curriculum. If a teacher feels pressure only to raise test scores, what ends up happening long-term is that teaching the whole child gets diluted and distorted, and the teacher instead "teaches to the test."
6. The data is inappropriate and unreliable, especially for K-2 students. How can a 5-, 6-, or 7-year-old be expected to self-guide their way through their own adaptive test on a computer when they are still learning and mastering letter sounds and reading? To test on a computer, you would have to assume the child is already a capable reader! What happens when they get to a word they don't know, can't comprehend, or can't sound out? FAIL, FAIL, FAIL --- but that's why we call it learning; struggling is not necessarily failing, though according to standardization it is. Why label a struggling child who is still quite capable of learning? Maybe it just hasn't clicked for him or her yet; these tests deflate rather than encourage.
7. MAP does not take into consideration ELL or ESL students.
8. MAP is very limited when it comes to Advanced or Accelerated students. There is a quick ceiling that a student hits on these tests. How can it measure significant growth in children who are already at the ceiling, and what happens when an already high score slips from a 99 to a 98? There is no meaningful way to account for that, and it reads as very negative despite the fact that the student is off the chart; any negative feedback and the child is labeled. It is very unreliable for above-average and below-average students. You cannot compare apples to oranges, and because the test results fluctuate so much and students do not get the exact same questions, there is no common denominator. Essentially, you cannot compare any of the aggregate data as a whole, and the testing company often recalibrates the percentiles.
9. MAP is completely unnecessary. There are MANY alternatives to MAP testing that teachers can administer themselves, which cost less or are free and are also less time-consuming. Good teachers already do this anyway and differentiate their lessons for students as needed. Teachers do not need data models or a computerized test to prove what they already know about whether or not a student is progressing.
10. The manner in which the MAP test product was selected and purchased is questionable. How many other vendors did the school board request or receive bids from for RIT/MAP testing? What process and motivation were used to approve the costs associated with this data-driven model? There are many other local options that can and should also be considered.
Overall, this test is designed to give children questions that are at times too difficult for them. Why are we asking them questions about things they have never studied?
From a former school district that recently got rid of the NWEA testing:
The data we get from NWEA can only be valuable to a point; honestly, you can only do so much with it. Mostly, as teachers and a board, we have come to the conclusion that the data from NWEA is just that: data! It only confirms what we already know. Heck, teachers work with these students on a daily basis. Teachers have NEVER lacked authentic data, just formal, organized, chartable data that can be entered into a system, which is what NWEA gives. But we all know it's not about the data; it never has been and never should be all about the data. It's about how to reach these kids, how to engage our students, especially those at the bottom of the barrel, those struggling learners. And you can't do that simply by reviewing reports of data.
Bottom line: a student and a teacher are more than a test score, and the NWEA MAP testing is NOT worth the money we spend on it!
And as I heard last night, demographics and data can be pulled up in an instant; anything you want to know can be obtained from third-party vendors such as Business Information Services LLC. All data can be skewed and/or sliced in many different ways; it just depends on how you want it to look. And I understand why now!!! Because it's a whole lot easier to pay the additional expense to have somebody else (typically out of state and out of mind) to point the finger at and lay the blame on, whether it be testing, standards, or re-districting!!
Imagine now, just for a second, that I put you in a car and we drove down the road to test your driving skills, and a cat or dog ran out in front of you. You SWERVE and you fail your MAP data-model adaptive test -- NO MORE DRIVER'S LICENSE FOR YOU -- oops!! Sorry about that... better luck next time; this computer model proves that you, my friend, are a horrible and unsafe driver. :(
http://www.commondreams.org/headline/2014/02/26-8
http://www.nationofchange.org/myth-behind-public-school-failure-1393173355
http://www.capitolhilltimes.com/2014/02/map-protest-leader-runs-teachers-union-president/