Featured Blog: ShadowCulture’s Bug Bash by Hans Bjordahl

I don’t know how I stumbled upon Bug Bash by Hans Bjordahl, but it’s extremely funny and very fresh. Each week there’s a new comic strip, and there’s a regular (but not daily) text blog to accompany the cartoons (though the posts typically aren’t directly tied to the strips).

I’m adding it to my Blogroll so I’ve got a handy link to it for myself (which is my number-one rule of thumb for who makes my Blogroll). Be warned: the comics are funny if you’re a tech geek, but I have no idea how they strike non-techies. I’ll calibrate by getting my English-major daughters to take a look… 🙂

Developers: Expect New Major Language Within Five Years

Developers: Expect New Major Language Within Five Years by Daryl K. Taft

There’s really not much to this article; it’s mostly a brief report from last week’s TheServerSide.com Java Symposium. The punchline is that we should expect a “major language” of the stature of Java within five years. Great, yet another programming language that I will have never coded in before being assigned to teach the class… 😉

Actually, my main motivation for sharing this article is that it gives me an opportunity to tell one of my favorite stories about Prof. Evan L. Ivie, who retired some years back from the Computer Science Department at BYU. Evan was at Bell Labs when Kernighan, Ritchie, and all those smart guys invented UNIX and C… in the group that he managed. He was like the Forrest Gump of computing. Seemed like every time something amazing happened in the world of computing, Evan was lurking somewhere nearby (or maybe managing the group that did it).

Anyway, one day in an operating systems class, Dr. Ivie made an offhand remark about “C, or whatever language was the fad at the time.” I was stunned. This was circa the early ’90s, and C was THE programming language that we all wanted to know, and that we all became really good at. We thought the Ten Commandments had been handed down to Moses in C and then translated into Hebrew. What we didn’t know (but Evan did) was that C was the language du jour, and that it would be followed by another, and yet another, and yet another, ad infinitum.

What’s amazing today is the absolute explosion of languages that students and professionals actually know and use: C, C++, C#, Java, Tcl, Python, Perl, Ruby, Smalltalk, Scheme, MATLAB, Lisp, Delphi, Visual Basic, JavaScript, PHP, Prolog, SQL, Pascal, Ada, etc. (PLEASE, please, please forgive me if I left your personal favorite programming language off this list; it’s random and off the top of my head!) It’s not the fact that these languages exist that’s so amazing; lots of goofy languages have always existed. It’s the fact that they’re all relevant, all used by a large enough number of people that most software engineers today have at least heard of them, even if (like me) they don’t know much about them.

So I wonder what the heck we’ll mean by a “major” language like Java in the future. Is that our ultimate destiny? An uber-language of some kind? A multiparadigm miracle language? Or is our future to be a large community of programming polyglots, drawing fluently from a dozen different toolboxes? I’m not sure which one I find more exciting… either way it’s pretty cool!
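Just to make the “multiparadigm” idea concrete, here’s a tiny illustrative sketch of my own (in Python, one of today’s multiparadigm languages, and not tied to any particular article) solving the same little problem two ways within a single language:

# Task: sum the squares of the even numbers in a list.
numbers = [1, 2, 3, 4, 5, 6]

# Imperative style: an explicit loop with a mutable accumulator.
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# Functional style: one declarative expression, no mutation in sight.
functional_total = sum(n * n for n in numbers if n % 2 == 0)

assert total == functional_total == 56  # 4 + 16 + 36

A polyglot switches languages to switch styles; a multiparadigm language lets you switch styles without switching languages.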

“I’m sorry… Our movie theater is controlled by the corporate office in Texas.” Huh?!

One isolated incident is an anomaly. Two may be a trend.

Let me explain…

About a year ago my wife and I were regulars at a local movie theater in the Provo area (which will, naturally, remain unnamed). By “regular” I mean we saw maybe two movies a month. One Friday evening, while waiting for my wife in the theater lobby, I stood watching movie trailers on a really nice, big flat-panel display hanging on the wall. I immediately noticed a mouse cursor obnoxiously parked in the middle of the screen. That told me three important things: 1) the display was being driven by a computer, rather than a DVD player set to loop; 2) whoever launched the trailers application forgot to move the cursor out of the way afterward; and 3) nobody had bothered to look at the output to see if it was working properly. Ever.

The next time we were at this theater (a week or two later) I sought out the display to see if the cursor was still there. Sure enough! This time, though, I also sought out a manager, pointed out the obnoxious arrow, and suggested that someone could go to the computer generating the video and move the cursor to the edge of the screen. Here comes the staggering part, and I swear I am not making this up: the manager apologetically told me that the video on that flat-panel display was generated by a computer at the corporate office in Texas; that he had complained repeatedly to them about it; that nothing had ever been done; and that the people in Texas were gone for the weekend after 5:00 on Friday afternoons.

The good news: The manager took responsibility, offered rational explanations for the observed phenomena, and didn’t make me feel like the obsessive techno-geek that I actually am.

The really silly news: Your video feed is controlled in Texas?! Okay, fine. Maybe I can buy that. But nobody in the corporate office will fix that?! Harder to swallow. And nobody at the corporate office is available on Friday and Saturday nights, when most of your weekly revenue shows up?! Hmmm.

So I chalked it up as an anomalous blip, an inexplicable amusement. Until this last Friday.

We were at a dollar movie theater on Friday afternoon for a matinee showing of “Night at the Museum” ($1 each before 5:00!). This theater shows late-run movies, about the time they come out on DVD, so you can get the live movie experience (sticky floor, greasy popcorn, crying babies, cell phone interruptions, talking teenagers) for less than it would cost to buy the DVD and take it home to your own family for a home movie experience (sticky floor, greasy popcorn, crying babies, cell phone interruptions, talking teenagers).

As we walked up to give our tickets to the ticket dude, we found what appeared to be the theater manager standing there instead. As he took our tickets, he warned us that the recent cold snap had taken the theater by surprise after unseasonably warm weather, and that the air conditioner was now running despite the fact that it was about 35 degrees outside. He went on to tell us that he had personally been up on the roof to try to fix it, to no avail, and that we might want to consider grabbing our jackets from the car, because… (drumroll please…) the air conditioning was controlled by the corporate office in Texas, and nobody was in that office after 5:00 on Friday, Texas time! I would not make something like this up. I swear.

The good news: The manager took responsibility, offered rational explanations for the observed phenomena, and graciously protected us from an unnaturally chilly movie viewing experience.

The extremely silly news: Your air conditioning is controlled in Texas?! No, not fine. Nobody in the corporate office is available on Friday and Saturday nights when most of your weekly revenue shows up?! Even more silly in this case.

It raises the question: Does what we gain in corporate centralization compensate for what we lose in local responsiveness? I’d have to ask the corporate office… but… they’re closed.

Sir Tim Berners-Lee Gives Congress Vision Of The Future

Sir Tim Berners-Lee Gives Congress Vision Of The Future by K.C. Jones

The title pretty much tells the tale. Nice summary of what Berners-Lee had to say to Congress. He is, of course, the inventor of the World Wide Web. It’s hard to come up with many inventions that have impacted the world as thoroughly as the Web. Automobiles? Air travel? Telephones? Cellular telephones? Satellites? Credit cards? The personal computer? Oh yeah, and the Web is 17 years old as an invention, 10 years old as a part of most people’s lives. The evolution of the Internet and the Web is like a great sci-fi story, but with entirely unrealistic time frames.

Court strikes down Internet porn law

I understand that issues involving free “speech” and the rights of adults are very complex, as is the interaction between government and the personal moral agency of its citizens. But as a parent I feel the current statutes and judicial decisions are sacrificing the moral lives of our children on the altar of adult choice.

In the ruling, the judge said parents can protect their children through software filters and other less restrictive means that do not limit the rights of others to free speech.

The Internet society in which we live is like a neighborhood in which strangers routinely walk into your house, inject drugs into your children, and then offer them more for free without your consent. I think we have to acknowledge the tremendously addictive nature of pornography, the deep impact of Internet porn addiction during the formative years of child development, and the overwhelming level at which our children are being exposed. We then need to begin imposing responsibility on those producing and distributing this drug.

Yes, as parents we should be doing everything in our power to protect our children. But the idea that we carry the burden of spending time and money to protect our families from perpetual moral onslaught, while the dealers bear no responsibility for providing this material to our children? Sheesh.

“It is not reasonable for the government to expect all parents to shoulder the burden to cut off every possible source of adult content for their children, rather than the government’s addressing the problem at its source,” a government attorney, Peter D. Keisler, argued in a post-trial brief.

Amen, Mr. Keisler!

Technology experts said parents now have more serious concerns than Web sites with pornography. For instance, the threat of online predators has caused worries among parents whose children use social-networking sites such as News Corp.’s MySpace.

Follow this argument with me. X is bad. But Y is even worse than X!! Hence, don’t worry about X. Huh?! It doesn’t make sense. Why are Internet predators so pervasive and problematic in our digital society? Think it through… Because they are without fail already addicted to Internet pornography (typically, but not exclusively, child porn). They are acting out something they’ve fantasized about and visualized repeatedly online already. You want to eliminate Internet predators? Eliminate Internet pornography. The source is essentially the same. Pornographic material is addictive, the behaviors it induces in its addicts are base and degrading, and the social consequences are devastating to the addicts, their families, their children, and every other innocent victim involved.

Continued Drop in CS Bachelor’s Degree Production and Enrollments as the Number of New Majors Stabilizes

Yet another article on CS enrollment, but this one from the Computing Research Association (CRA).

According to HERI/UCLA, the percentage of incoming undergraduates among all degree-granting institutions who indicated they would major in CS declined by 70 percent between fall 2000 and 2005.[1] Unsurprisingly, the number of students who declared their major in CS among the Ph.D.-granting departments surveyed by CRA also fell (Figure 1). After six years of declines, the number of new CS majors in fall 2006 was half of what it was in fall 2000 (15,958 versus 7,798). Nevertheless, this was only a slight decline from the 7,952 new majors reported in fall 2005, and may indicate that the numbers are stabilizing.

It’s a very short article with several enlightening graphs. I particularly like the concluding paragraph, which lends some historical perspective that seems to be lacking in almost every other article I’ve seen on this subject.

It is important to note that a steep drop in degree production among CS departments has happened before. According to NSF, between 1980 and 1986 undergraduate CS production nearly quadrupled to more than 42,000 degrees. This period was followed by a swift decline and leveling off during the 1990s, with several years in which the number of degrees granted hovered around 25,000. During the late 1990s, CS degree production again surged to more than 57,000 in 2004.[2] In light of the economic downturn and slow job growth during the early 2000s, the current decline in CS degree production was foreseeable.

‘Cute Knut’ delights German crowds in debut

Okay, so this has absolutely nothing to do with anything related to anything this blog is supposed to be remotely about.

But…

The new cuddly baby polar bear at the Berlin Zoo has been dubbed “cute Knut” by the German media. It’s now just a matter of time before one or all of my children come home from school bearing (as it were) tales of a new nickname.

Granted, there’s not a lot else you can do with “Knutson,” other than simply mangle and mispronounce it. I regularly get phone calls for “Charles Nutson,” and it helps me to know that it’s nobody I know. In my young married years I would give “Knutson” when waiting for a table at a restaurant, until one night at the Space Needle in Seattle a hostess hollered out, “Contusion?! Contusion?!” Everyone looked around for the idiot named after a bruise, and I stood up against my better judgment. It was that or lose the table. After that it was always just “Chuck,” thank you very much.

But now my kids (and me too, I suppose) get to be “cute Knut” thanks to the bear and the Germans.

Students staying away from IT majors

Students staying away from IT majors, by Nicole Dionne, Providence Business News.

Thank you, Nicole, for a positive article that appears to back up my bold prediction from an earlier post.

Despite the fact that high-paying technology jobs are plentiful, students have been shying away from the profession and leaving a talent gap across the country.

“We have more jobs than students,” said Hal Records, chairman of the computer information systems department at Bryant University. “In 2001, with the dot-com crash, a lot of parents lost their jobs, and everybody said technology is not a good way to go.”

At Bryant, he said, “we’ve gone from 119 graduates in 2001 to 60 last year, and we’ll probably graduate 40 this year. The irony is, the jobs never really went away. In spite of the outsourcing, the number of technology jobs is increasing. So what we have now is a massive gap between the demand for technology graduates and the supply of technology graduates.”

Amen. That’s what I’m seeing. Glad to know I’m not the only one… 🙂

Is computer science dead?

Without intending to do so, I appear to have launched a series of posts on the topic of Computer Science enrollments. I guess it’s been getting quite a bit of press attention lately, and we’ve certainly had our fair share of discussions on the subject in the CS department at BYU.

With respect to the title of this article, it’s one thing to note that there are upturns and downturns in any industry, particularly in one as volatile as ours. But it’s the “X is dead” alarmists (or the more subtle “Is X dead?” alarmists) whose arrival officially announces the popularity of issue X, whatever it happens to be. So here we are.

The author of this article leads by citing the article by Neil McBride, whom we already discussed in a previous post.

After some gloom-and-doom reporting on dropping enrollments, and a moving story about a young woman whose family insisted she would never get a job (but who did, wonder of wonders, miracle of miracles), the author settles into some decent analysis, culminating in some potentially encouraging (or at least pleasantly neutral) news.

“Some of the newer aspects of IT are more prescriptive; they require less innovation. Computer science is more cutting edge,” Professor Looi says.

The university offers a general IT overview in the first year of its technology degree, branching into business analysis, software development, system administration and computer science. There are 40 students in computer science, down from 200 15 years ago.

But Professor Looi still believes there is a market for purists. “There will always be a need for computer scientists or technology will stop advancing,” he says.

“It will only advance on what has already been invented. We still need people to create, but possibly not as many as we needed before.”

Or possibly more than we needed before…

Or possibly a greater diversity of professional destinations for technical people than before…

Or possibly…

At this point I’m just not buying the demise reporting. I think two issues have contributed to decreased enrollments: 1) the dot-com implosion and 2) the threat of outsourcing. I’ll hit those in greater detail in a later post.

Meanwhile, it seems to me that the market is once again heating up, and employers in the Utah software industry can’t fill positions because demand is so far outstripping supply. Salaries are climbing back to where they were six or seven years ago, during the paper-thin, overheated boom.

I’m going on record right now to predict that enrollments in Computer Science worldwide will begin rising no later than Fall 2008 (although my heart is screaming “2007!”). I further predict that Computer Science enrollments will exceed 2001 enrollment levels by 2015. Yes, I know about outsourcing. I’m including that reality in my prediction. I’ll try to write a bit more about China and India and what outsourcing implies as soon as I can get to it.