Education in the Old and New World
The pre-computer world and the post-computer world obviously need very different skills. This has meant that education itself has been split wide open, leading to much confusion and difficulty in setting and achieving goals. If the very definition of knowledge has changed, then how can we best educate someone? Do we decide what facts students should learn, teach those facts, and then test to see how we've done? Or should we help students set their own learning goals, train them to find the information they need, and then help them demonstrate their learning? These are very, very different objectives. And yet, if you look carefully at school districts, school buildings, and even individual classrooms, you'll find a kind of schizophrenic attempt to do bits of both, with very disheartening results. When one sets off on a trip and then changes direction every 100 feet, one isn't likely to arrive at a desirable destination, or indeed travel very far at all.
Pre-computer education was built on several fallacies. First, there was an assembly-line mentality that all students are alike, that all can do the same things, that all SHOULD do the same things. This makes little sense as the world gets more and more specialized, but it is now being enforced as never before through all the standardized tests required under the No Child Left Behind law. Another fallacy is that learning follows an orderly and predictable route. Thus "pre-reading" skills are religiously taught, despite the fact that millions of children (including my own) learn to read without ever having been taught them. A third fallacy is that anyone who simply memorizes enough things will succeed in the world. In reality, more people are fired from jobs for not being able to work with other people (something left out of NCLB altogether) than for lacking some body of factual knowledge. Last but not least, there is a belief that somewhere, some collection of people knows exactly what everyone in the country should know, so there is a way to design a test that will measure whether one is an "educated person". When Minnesota held hearings on the Social Studies curriculum, this assumption was thoroughly tested: the extremely long list of competencies that the "experts" had designed was shown to be heavily biased, and it contained nothing about understanding current political issues, choosing among candidates, or even the basic mechanics of how to vote! Every curriculum area is like this: there is much controversy, even among the experts in each field, about what comprises its basic knowledge.
Post-computer education would be entirely different, and as such it is difficult even to imagine how it would look. For one thing, emphasizing the skills of finding information rather than memorizing it opens up the possibility that students get to decide what to research. Also, since students often know more about the Internet than their teachers, there would be more of a collegial atmosphere than a dictatorial one, where teachers help pose questions or problems and then all parties do some investigation, sharing their findings equally. This goes for hardware and software too: those of us who've worked in schools already know that the techie kids are a necessary component of the school's computer department, and they often know more than the people hired to supervise them. Demonstrating learning won't be done through testing, since everyone will be learning about different topics. Instead there will be student presentations, PowerPoints, essays, even books or movies that show the meaningful things (to them, and hopefully to others) that they have investigated. These can be shared with others inside the school, but also with the outside community via websites or real-time presentations, in person or online. Rather than going on to more generalized training such as college, more students will choose to specialize from the start; they may launch their own business, or go to specialized schools or online classes to learn just what they need to know to take the next step in their chosen field, whether it be hair design, international negotiations, or quantum physics.
In schools today there are trends toward both worlds. The pre-computer direction is yielding standardized curricula (most textbook companies now advertise that they are "standards based", meaning that they'll teach to the tests), and teachers are being let go, and schools and districts taken over, if their students' test scores aren't high enough. This is actually producing more "throw-away kids", because districts don't mind if low-achieving kids drop out; those are the very students who drag down the district's test scores. Even community colleges offering hands-on training such as carpentry or cooking are requiring students to pass tests in order to get in. At the same time, some schools and teachers are asking for creativity and original research projects, assuming that this new world will require all the adaptability their students can muster. Service learning is also a hot topic right now, requiring that students go out into the community and exhibit people skills which the schools have totally ignored up to that point. And alternatives such as charter schools are showing that parents and students really crave the more humane, individualized education we've lost through all the years of standardization.
We need to begin discussing these conflicting aims, trying to come up with some understanding of how the two worlds can cooperate with each other, both for the sake of the children and for the sake of the world we're launching them into. Instead of retreating into our separate corners, let's talk about how we can create forums and find some common goals. I'll work on this in coming months.
What symptoms of these splits do you see? Have you seen these cross-era forums happening? If so, where and how?