
Mar 27, 2007

Comments

Nice post. Yes, those predictions at the end came straight from Kurzweil's book. And I decided not to include some of his other predictions (around health care, human longevity, and the singularity) because I was worried that would detract from the rest of the message for my staff.

I've used Hoffer's quote as well, and I think we really need to think hard about what it means for education if Kurzweil is right and we are in the knee of the curve.

Thanks for the link and for sharing your thoughts. Karl and I have enjoyed seeing how this video has spread!

Thanks so much for the comments and words of support, Karl and Scott!

The fact that one can comment on something created by a person one has never met, only to have that person reply directly within a matter of hours, is yet another example of how the world is shrinking/flattening and how technology brings us all closer. Amazing...

As you mentioned, Karl, this pace of change is going to force a deep reflection on and reconsideration of our current educational strategies, whether we like it or not. What worked yesterday almost certainly won't work tomorrow.

It's the old '...if it was good enough for me...' syndrome. "You and I are products of this (supposedly) 'broken' school system, and we didn't turn out too badly (have good jobs, make decent money, etc.), so just how bad can it really be?" The fact is, it's probably worse than we think, because even if we adopt wide/deep/progressive changes today, there's still a large segment of tomorrow's workforce who will be orphaned, not equipped with the skills they'll need to compete.

Two final quotes to consider:

"Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everybody gets busy on the proof."
-- John Kenneth Galbraith

"The new education must teach the individual how to classify and reclassify information, how to evaluate its veracity, how to change categories... how to teach himself. Tomorrow's illiterate will not be the man who can't read; he will be the man who has not learned how to learn."
-- Herbert Gerjuoy

A good example of the shifts in the e-learning domain: http://www.sutree.com

It's a social bookmarking site (kinda like Digg) for free video lessons & tutorials.

Good one. I am not sure that I trust all the numbers, but I agree with most of them.

That's a very interesting video, but parts of it seemed a bit misleading.

For example, the part where it claims that students are being prepared for "jobs that don't exist, using technologies that hadn't been invented...", or that "half of what they learn is outdated by their third year". Those really aren't logically sound conclusions; I could explain my reasoning if you like.

The bits saying "by year 'foo', a $1000 computer will exceed the capabilities of 'bar'" were also quite meaningless. Capability cannot be measured in instructions per second. Intelligence -- computational capability -- lies in knowing what these instructions should be. A computer that does one instruction per second, runs for billions of years, and solves conceptually complicated problems is more capable (in my opinion) than a blazing fast supercomputer that runs nothing out of the ordinary. In other words, I think it's software, rather than hardware, that defines how capable a computer is. And unfortunately, we won't have software clearly more capable than the human brain any time in the near future.
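To make that concrete, here's a toy sketch (my own contrived example, with arbitrary numbers -- nothing from the video or Kurzweil): two fragments that execute instructions at a comparable rate, only one of which accomplishes anything. Raw instruction throughput tells you nothing about what the instructions are for.

    /* Toy illustration: instructions per second is not capability. */
    #include <stdio.h>

    int main(void) {
        /* "Fast but empty": a billion instructions that compute nothing. */
        volatile unsigned long busy = 0;
        unsigned long i;
        for (i = 0; i < 1000000000UL; i++)
            busy++;

        /* "Slow but capable": a handful of instructions that actually
           solve a problem (Euclid's algorithm for a GCD). */
        unsigned long a = 1071931250UL, b = 462837640UL;
        while (b != 0) {
            unsigned long t = b;
            b = a % b;
            a = t;
        }
        printf("gcd = %lu\n", a);
        return 0;
    }

The first loop would post impressive instructions-per-second numbers; the second is the only part worth running.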

Why am I making such a big deal out of this? Because I think excessively optimistic predictions, well presented but without a sound basis, do more harm than good. I read one of Ray Kurzweil's books a couple of years ago, and was disappointed. Kurzweil seems like a smart guy, and has made some contributions to the field over the years, but I really think his conclusions build false expectations. Wild speculation should be presented as wild speculation.

Thanks for taking the time to weigh in, Nath - it's sincerely appreciated!

I'd love to hear more about your doubts regarding the items you mentioned (especially the bit about students prepping for jobs that don't exist). As with any speculation about the future, it's not really a matter of "right" vs. "wrong", but of which of a myriad of plausible scenarios, in all their shades of gray, will actually emerge. Debate on topics like this is healthy, as I'm sure that no one (you, me, Kurzweil) has it totally correct.

Having delivered that invitation, I have to admit that I fall more on the other end of the spectrum of optimism (surprise!). While you aren't the first (nor will you be the last) to criticize Kurzweil for being over-the-top in his predictions, the exponential trendlines he has researched are pretty stunning and thought-provoking (and, I'd argue, constitute a "sound basis", even if you don't agree with the conclusions that are drawn).

I guess the heart of my response to your comment (especially the part about "excessively optimistic predictions...doing more harm than good") is this:

- I tend to think that provoking thought (even if it's via some exaggerated claims) is, more often than not, a good thing. Even if the claims never materialize, they force deep thinking about knotty issues, and that furthers our collective understanding.

- Change is afoot. Not "just like we've historically seen" change, but SERIOUS shifts in multiple areas of our lives. And it's occurring as we speak. Usually change is too slow and subtle to be consciously noticed, but the reality of this pace of change isn't an academic theory; it can be seen by any common person who is aware of their surroundings. The details around the exact dates and quantitative amounts of the change (as quoted in the video and/or Kurzweil) may be 'off' (likely will be!), but the trendlines are not; a quick back-of-the-envelope sketch below shows why.
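Here's that back-of-the-envelope sketch (my own illustrative numbers, not Kurzweil's): assume price-performance doubles roughly every 18 months and watch how a fixed budget compounds.

    /* Illustrative doubling-curve arithmetic. The 18-month doubling
       period is an assumption, not a measurement; the exact constant
       is debatable, but the shape of the curve is the point. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        const double doubling_months = 18.0;
        int years;
        for (years = 0; years <= 30; years += 5) {
            double doublings = (years * 12.0) / doubling_months;
            printf("after %2d years: ~%.0fx the compute per dollar\n",
                   years, pow(2.0, doublings));
        }
        return 0;
    }

Run it and you get roughly 100x per dollar after a decade and around a million-fold after thirty years. Quibble with the 18 months all you like; any doubling period in that neighborhood produces the same eye-popping shape.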

There is a certain similarity with the new (renewed?) attention that global warming has been receiving of late. People can argue the validity and meaning of one study or set of data versus another, but the bigger picture of the trendlines is the point that should not be lost in the debate.

The interesting thing about change is that it happens whether you recognize and/or agree with it or not. :-)

My main problem is this: on what basis does the video claim that the technology used for these jobs does not yet exist? It's an exceptional claim, and thus requires exceptional justification. Computer science, for example, is considered a fast-changing field. And yet, most computer scientists and programmers spend most of their time working with technology that has existed for years -- decades, even.

Provoking thought is certainly a good thing. I've been meaning to write about the role science fiction plays (or at least, should play) in giving direction to human thought. But science fiction is still fiction, and is presented as such.

Excessively optimistic claims are genuinely harmful, because five, or ten, or fifteen years down the line, people will ask questions when the predictions fall through. Potential researchers become disillusioned. Funding agencies and investors become disillusioned. In the long run, overconfident predictions are detrimental to progress. A great many predictors of the future have been vastly overconfident (as evidenced by the lack of human civilizations on Jupiter, of superhuman AI, and so on and so forth).

This isn't speculation. I am an artificial intelligence student; this has happened in my field.

Interesting thoughts, Nath - thanks for taking the time to expand on your original post.

In some areas, we'll have to agree to disagree - much of this is opinion and cannot be "proven" one way or another. Either way, you've provided some rich "food for thought"!

I will say that the World Wide Web is a powerful example of a technology used in jobs that didn't exist even two decades ago (less than one generation's schooling). No one studying Comp Sci even up through the late '80s had an inkling that HTML and the myriad of accompanying Internet technologies would attain the prominence they have today. I can't imagine telling a student today which specific technology to study to ensure their employability tomorrow (broad categories and general domains, perhaps).

I lived through the tail end of the AI boom (I have my MS in AI/CS), so I understand where you are coming from, in a sense. But I cannot help but wonder whether the (in retrospect) outrageous claims that were made did, in part, fuel the progress that actually occurred in related and even unrelated fields of study. Hmmm....

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man."
-- George Bernard Shaw

Yes, new technology does emerge on occasion. I should more accurately have said that computer scientists spend most of their time working with *principles* that have existed for decades.

The occasional appearance of new technology is no justification for the claim in the video that "half of what [college students] learn in their first year will be outdated by their fourth year of study". I can't remember a single thing I learned in my first year that was relevant then but isn't today. (You were a computer science student yourself; I'm sure most of what you learned in your algorithms classes etc. still serves you well, if you are still in a related field.) Syntax changes, but syntax is nothing. A distributed programmer stepping out of suspended animation from 1980 would probably find himself able to be productive in today's society after spending just a few months orienting himself to all the changes. Indeed, most network code is still written in C.
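To put a face on that last point, here's a minimal sketch (a throwaway example against a reserved documentation address, not anyone's production code) of a TCP client written against the BSD sockets API. The core calls -- socket(), connect(), send(), close() -- date from the early 1980s, and this is still, broadly, what network code in C looks like today:

    /* Minimal TCP client; the core sockets API it uses predates the Web.
       192.0.2.1 and port 80 are placeholders for illustration only. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>
    #include <sys/socket.h>

    int main(void) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return 1; }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof addr);
        addr.sin_family = AF_INET;
        addr.sin_port = htons(80);                        /* placeholder port */
        inet_pton(AF_INET, "192.0.2.1", &addr.sin_addr);  /* placeholder host */

        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("connect");
            close(fd);
            return 1;
        }

        const char *req = "GET / HTTP/1.0\r\n\r\n";
        send(fd, req, strlen(req), 0);
        close(fd);
        return 0;
    }

Our hypothetical programmer emerging from 1980 would recognize nearly every line of this.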
