Something I’ve been thinking about a lot recently is what should be taught at universities, particularly in my area (Computer Science). What has been bugging me is the balance between coverage and detail. Should the balance be weighted towards one at the expense of the other, or should they be given equal weight?
A good way of framing this is with development concepts. Do you teach students many programming languages, giving them experience of many different syntaxes and programming methodologies? Or do you concentrate on fewer languages and go into more detail about what you can do with them?
Personally, I’m inclined to go with the latter (to a point): concentrate your efforts on fewer languages, and teach students to apply that knowledge to areas they haven’t been taught directly. However, this approach has problems. Could it jeopardise students in the future, making them feel restricted from branching into the unknown? Could it also mean they don’t look as good to an employer?
Although I’ve used programming as an example (as it’s the area I’m most familiar with), this balance applies to any learning situation.
Comments please.