Last night I had a bit of a party for my 33rd birthday. One of my friends who came over is an IT lecturer at the local university. During the evening he asked my opinion on whether they should teach C# in first year. He said that some of the local industry had been asking about it and he was wondering what the university should do about it.
I remember a couple of days ago on Slashdot there was a discussion about programmers/coders versus software engineers, which is very relevant to this question.
I believe the question is whether universities should teach you how to program in a certain language OR teach you the fundamentals of computing so you can teach yourself how to build computer systems (in any language/environment). I personally prefer the latter approach, because in most cases the languages you learn at university are probably not going to be the languages you end up programming in anyway (I learnt Fortran, Pascal and C at uni). I believe it is much better to teach the fundamentals of computing: compilers, operating systems, databases, networking. If you are going to teach specific programming languages, leave it till the last year so that students already have the fundamentals in place, and so the languages are as relevant as possible.
It is a big problem in Australia, because in an environment where young people can get a job in a mine driving a truck down a hole for $120,000/year, there is a lot of pressure to attract people to university. This puts pressure on universities to make their first year courses as "sexy" as possible. The problem with this approach is that you inevitably end up dumbing down the coursework to attract more people and teach only the most "exciting" concepts. This in turn means less of the "boring" fundamentals, so graduates learn fewer of the things that will actually make them good system builders in the future, and have less of an understanding of computing as a whole. The end result is that no one bothers to go to university, because they see the academic level of the graduates and what they are learning, and cannot see the value in it. If university degrees end up being .NET (or Java) courseware, why would anyone bother?
I think the IT industry has already suffered through a lot of this, and I personally believe the best software engineers I know are people who have had experience in many areas of computing (systems administration, network support, database administration) as well as business skills.