Version Control and Higher Education
I haven't met any developer who was incapable of learning a version control system on the job. Why should universities spend students' very expensive class time teaching it? That time would be much better spent on fundamental knowledge such as algorithm design than on a specific tool that could be obsolete by the time the student graduates. Also, in a first-semester programming course, version control is a distraction from everything else you need to learn to write a working program, like programming language syntax and building a mental model of how program execution works. (Not every CS student gets to college already knowing this stuff!)
Which is not to say that universities shouldn't make these tools available and encourage their use for class projects. But that doesn't require them to "integrate version control into their curriculum" like the article suggests. The documentation for git is freely available on the web and anyone who wants to can read it on their own time and ask their fellow students about it.
The larger philosophical question here is whether higher education is a vocational program for creating good "team players" who are productive with the current technology on their first day of employment, or whether it should provide an education that teaches fundamental knowledge and how to learn new things on your own.
Side note: I've taught a first semester CS course in a well-known university, so I have some experience with the difficulties that even some very smart students face while trying to pick up the basic concepts. Adding in version control would have made it harder for some of them to successfully complete the course.
So, I had at least one project at school fail completely because we weren't using version control. Had the culture of the institution been "You will use a repo for this class, and you will learn to like it," that might have been averted (worse, I already knew and used repos for other things--I just derped with my partner on that project).
I think that including these basic tools in an intro course (perhaps as a first or second class topic) would be a good idea. The minor waste of expensive education time will be more than offset by the added productivity in later courses, and a grading workflow built around test suites that one simply pushes code to would probably make students' and graders' lives easier.
An anecdote:
One of my CS courses spent a week or two going over C and its pitfalls. Now, one could argue that any sufficiently bright student should be able to pick C up incidental to completing a course on programming in 'nix, but the fact is that the additional material covering things like dumb use of unions and linker errors and whatnot saved the ass of several classmates time and time again, both in school and in later years.
You can't have a purely theoretical education, especially in a field where at the end of the day you need (to quote Zed) "programming, motherfucker."
Students will in all likelihood get far more mileage out of a fast intro to VCS and *nix over the rest of their academic and professional careers than learning one more esoteric data structure.
By similar logic, mechanical engineering courses should never require students to learn the rudiments of welding or machining.
I'm pretty much failing to make sense of your last sentence. Is that "never" supposed to be there?
He means "by similar logic [to the post that I am replying to]". In other words, he is claiming that using the arguments from the post will lead to making that statement with the word "never". He himself probably doesn't believe "never".
This is the correct interpretation--looking back on it the wording is ambiguous.
Ah, that makes sense.
> Why should universities spend the student's very expensive class time to learn about this?
To turn this around, why should businesses pay former students to learn about this on the job instead of paying them to be productive?
From my experience, there's a large number of developers (both young and old) out there who are very capable programmers, but seem to not spend any of their spare time learning things on their own after leaving school. It's insanely frustrating having to teach new junior developers how to properly use a tool like git when they've never even been exposed to version control, all while they're supposed to be productive. It really can become a waste of company resources.
Are they capable of learning version control on the job? In most cases, absolutely, given enough time and buffer for mistakes. However, I believe, as part of professional development in the field they chose, it should be something they learn outside of work and preferably before getting the job. University seems like a very valid place to at least introduce and encourage use of version control systems early on, if not at the very least in a higher level course.
"To turn this around, why should businesses pay former students to learn about this on the job instead of paying them to be productive?"
Because it's expected that entry-level employees don't have a lot of experience. It's even possible that these young people have taught themselves all sorts of useful things on their own already, but haven't gotten around to teaching themselves version control or the particular database system or development environment that your company uses. I think it's a better investment to hire someone who is smart and a fast learner than it is to hire someone who happens to know the particular tools you currently use.
In the software field, we've somehow decided that it's normal for entry-level employees to have lots of practical experience. But in most other jobs, that's not the case: they learn at their employers' expense. How many new hires on Wall Street have ever used a bond trader's workstation? How many newly-hired railroad employees have ever driven a train? How many newly-hired lawyers have ever represented a client in court?
If you want people who know git, then hire people who know git. If you don't want to pay for people who already know git, then you're going to have to pay people to learn git.
Spot on. There are 10,000 things about working in "a real job" that you either don't learn or don't make sense to teach as part of an undergraduate education. University is not vocational school.
I agree that there are more important concepts that need to be covered in freshman/intro-level CS courses, but some of the graduates I was meeting and interviewing had not even heard of version control or source code management, let alone spent any time collaborating with a team of developers. They were not getting any exposure to it at any level within their program. IMO, source control management should be as essential as ssh-ing into a remote server or compiling and running a program in the shell.
One way or another, I feel they should come out of college knowing something about version control, but not in the first class. I can easily believe greenyoda that trying to learn programming and, say, git at the same time would be a disaster. I'm tempted to say you should just warn them that at some point it's going to be part of their class workflow, but I don't think that would fly bureaucratically. At the same time, it would be weird to just have a class in "version control"; it only makes sense in the context of another project. So maybe make it part of a lab-style class, with a focus on individual instruction, and RTFMing.
Then I get to my internship and have to learn Team Foundation Server... Sheesh.
This is the case in the classes I teach. We use Git as the primary vehicle for submitting assignments and tracking work. I usually spend the first week introducing students to the Linux shell and interacting with Git and Github.
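For anyone who hasn't seen such a setup, here's a minimal sketch of what a push-to-submit workflow can look like, simulated entirely on one machine. All of the names and paths below are invented for illustration, and a local bare repository stands in for the course server:

```shell
#!/bin/sh
set -e
# Hypothetical course setup: a bare repo plays the "submission server".
course=$(mktemp -d)
git init --bare -q "$course/assignment1.git"

# Student side: clone, solve, commit, push.
git clone -q "$course/assignment1.git" "$course/work" 2>/dev/null
cd "$course/work"
git config user.name "Student"
git config user.email "student@example.edu"
echo 'print("hello, grader")' > solution.py
git add solution.py
git commit -q -m "Implement assignment 1"
git push -q origin HEAD

# Grader side: the commit now lives on the "server"; a hook or CI job
# could run the test suite at this point.
git --git-dir="$course/assignment1.git" log --oneline --all
```

In a real course the clone URL would point at a campus server or GitHub, and the grading would hang off a post-receive hook rather than a manual log check.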
Why not assign something like http://try.github.io/levels/1/challenges/1 as homework instead?
I strongly agree. I've met new graduates and some working developers, perhaps 60% of whom didn't understand version control systems at all.
I also agree. I've met university lecturers who didn't know that any version control more recent than RCS existed.
I blogged a response: http://jarofgreen.co.uk/2013/05/why-programmers-should-learn...
Basically, I think it's important to teach Git (or another version control system) to students because what you are really teaching them is good development practices. These take time to learn and get used to, and universities absolutely should be teaching them.
The whole issue of "students should be able to teach themselves git" is a bit of a red herring, IMHO.
One of the classes I teach is at the senior undergrad level, and they have been thrilled that I have them use version control. Many have commented that they wish that they had known about git in earlier classes, as it would have saved them when they screwed up their programs.
This comment is so good I quoted it with citation, hope you don't mind. http://jarofgreen.co.uk/2013/05/why-programmers-should-learn...
At the University of Montreal, one teacher introduced Subversion in a Software Engineering class and another Git in a compiler class. Version control is taught at university, it just depends which university.
Oh, I thought that was going to be about code written by education people. Actually, it's a larger problem - I'm constantly shocked that anybody who writes documents for a living (code or otherwise) is relying on the filesystem. My wife's a teacher where her department pools their lesson plans and tests and whatnot and I'm constantly harassing her about the importance of setting up some kind of VC to keep track of all that crap.
The git log could be a gold mine for researchers studying how CS students learn CS concepts and programming techniques. It could also be used to detect cheating.
commit d6abf62d1981a6c16029d10a5ecbe0c5c494322b
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 12:21:11 2013 +0000

    Change copyrights to my name (woo, dodged a bullet there!).

commit 9c66001752f643e8f2a3453662352cf26402cee3
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 12:03:01 2013 +0000

    Change variable names.

commit e9b9a740ca52e6fc2843d2d5aed2865c56ace45a
Author: Biff Joe <biff@bork.edu>
Date:   Thu Mar 28 11:39:44 2013 +0000

    Import Nigel's code.
This is like those mandatory undergrad computing courses where they teach you to use Word by typing into an open document. If you can't figure it out with a few minutes and access to the help files, maybe computers just aren't your thing.
I feel many of my upper-division projects would have benefited from the ability to branch and try new approaches, safe in the knowledge that they could be easily abandoned if they failed to bear fruit.
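That branch-and-abandon workflow is only a handful of commands. A quick sketch (the repo, file, and branch names here are all made up):

```shell
#!/bin/sh
set -e
# Start from a working baseline in a throwaway repo.
proj=$(mktemp -d); cd "$proj"
git init -q
git config user.name "Student"
git config user.email "student@example.edu"
echo "stable approach" > solver.py
git add solver.py
git commit -q -m "Working baseline"

# Branch off to try a risky rewrite in isolation.
git checkout -q -b experiment
echo "risky rewrite" > solver.py
git commit -q -am "Try a new algorithm"

# It didn't bear fruit: return to the baseline and delete the branch.
git checkout -q -        # "-" means the previously checked-out branch
git branch -q -D experiment
cat solver.py            # the baseline is untouched
```

The point being that the failed experiment costs nothing: the working version was never at risk.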
This page also doesn't scroll with Chrome on my Nexus 7.
Same, I'm wondering what kind of bug makes that happen.
Who are these CS students who need their hands held to such a degree? A simple "You should consider using version control for your projects. Try git or subversion. It will come in handy later in your career" at the beginning of an introductory programming class should be all that's necessary. A CS curriculum doesn't teach students how to touch-type either, nor should it.
I'd suggest to people doing hiring that the candidates who haven't even heard of version control (which some other posters have mentioned interviewing) are ones that you don't want to hire. You should be glad to have the easy selection criterion.
I agree completely, but I also teach non-CS students mostly in design. If version control is a foreign concept to some CS students, it's not even on the radar of most of my design group.