Freshmans First Language

What programming language should we teach freshmen first? (EnglishLanguage?)

It's too late to learn one's first programming language in college. Compare with FirstLanguageLearned and FirstTimeLanguage. That said, as a prospective programming professional one should learn as many ProgrammingParadigms as possible. It is more important to learn how to learn a language (LearningProgrammingLanguages), than to learn a particular language.

We want freshmen to learn

It would be nice if the language were

I suspect no single language could ever do all that. Maybe the best would be to have students study several languages, with bonus points for figuring out which languages are more suited than others to specific kinds of tasks, and extra credit for outlining the areas of overlap.

The consensus seems to be that some interactive high-level language is the best first language (with a wide variety of opinions), but that CompSci students should immediately learn a low-level language (which everyone agrees is one of: CeeLanguage, AssemblyLanguage, C++ used as a slightly improved C, or ForthLanguage).

Perhaps we should split this page into FirstTimeLanguage (for the high-level language stuff) and FirstLowLevelLanguage? (for the low-level stuff).
I'm surprised about all the suggestions for C or assembler. In a freshman class, it's best to stick to an easy-to-use language with few gotchas. After all, you want the bottom half of the class to succeed. Java seems like a reasonable possibility, since it's also a useful language to know. Students can learn C, assembler, LISP, etc. in later classes, but it's best to start with baby steps. -- JaredLevy

No, I don't want the bottom half of the class to succeed. Hoping everyone does well is one thing, but carrying the tradition of giving every loser a trophy for showing up into the computer science and engineering colleges is wrong-minded. There is such a thing as being not smart enough to be worth others spending their time on your code. That some (or all) code produced by any given programmer will need attention from others at some point is a given, provided the software is used by more than the original author. We have spent far too much time in the last decade inventing ways to efficiently identify bad code, interface with bad libraries, and otherwise deal with the glut of poo that has proliferated ever faster since schools stopped teaching C and started teaching Python and Java as first languages instead. Not requiring electrical engineering as a prereq was probably the beginning of the bad trends here. If we simply carry the bottom half - which doesn't stop in freshman courses but becomes an institutional goal by the time that freshman is a junior - we merely pollute the working pool and undermine the value of a CS or engineering degree. -- CraigEverett?

If Java is suitable for baby-steps, C is too. I can't see why and how Java can be simpler to learn than C. -- MauroPanigada

: Three words: automatic memory management. No messing around with pointers. Java has other problems that make it a bad language to start on, but they got that one right. -- MarnenLaibowKoser
I've taught a few large first-year programming classes, and here are some of the things that must be balanced:

I agree with those who suggest that we emphasize lower-level details in early programming courses, and that we provide students with a good mental model for thinking about programming. C's model of computing is nice since it's so simple, and so ubiquitous.

If the problem is the emphasis on details that we would like to disregard when implementing an algorithm, then the best language to learn is a pseudolanguage that changes to fit the particular needs of the algorithm we need to code. I.e., no language at all, since the problem can be described and "solved" in any pseudolanguage. If a student is able to express the algorithm in a pseudolanguage which s/he invented, in a way that is comprehensible even to someone who does not know the "pseudolanguage", then s/he surely (?) will succeed in putting the algorithm into real code in any existing language, provided that s/he makes the needed effort to learn the basics of the language. -- MauroPanigada

Hmm. This is a laudable approach, but ignores the basic act of programming as using a computer to perform a task. Programming isn't necessarily about problem solving, but it is always task oriented. One gets a task accomplished with the tools one has at one's disposal. One changes spark plugs with a socket wrench. One sweeps a floor with a broom. One sorts addresses in a list with a qsort.

As such, the use of a computer to execute a program can't simply be abstracted away from the reality of the machine operating in the background. It is imperative that the student know what the computer may be doing by itself if not told specifically what to do, and even when it is given specifics. Languages like C/C++ and Java are close enough to the hardware that the student gets a glimpse behind the curtain even when creating the most simple applications. This is a Good Thing® for the student to have under his belt.
I've always been amazed that seniors at an accredited computer science school have managed to get so far without understanding pointers or how memory really works. I'm also amazed at the lack of understanding of the big picture of putting a program together. This set me to thinking about what the best language to start with would be. On the one hand, making students do largish projects in ASM wouldn't be too cruel, and it would certainly make them understand how memory works. I've come to the conclusion that even if you only want to go on to become a VisualBasic programmer doing DB work, a firm understanding of ASM and the ability to do real projects in it is essential.

On the other hand, I'd like see people understand how to put together large systems. Asm doesn't lend itself to this. Python, VB, smalltalk, et cetera do.

So, the question is, is it better to start at a high level and work down, or to start at a low level and work up? Personally, I started with the old line-numbered basic, then went to quickbasic (moving up I guess). I then went to C (down) then C++ (up), then ASM (down), and am now bouncing all over the place between C/C++, assembly, python, perl, smalltalk, etc.

I have definitely come to the conclusion that C/C++ and java are just about the absolute worst places to start.

Just a data point - at my alma mater, PolytechnicUniversity, the first language taught was Pascal (we used PeeCees with TurboPascal v3). This was the language used to teach basic program construction, as well as to demonstrate other basic concepts like data structures (lists, queues, stacks, etc.) The next language taught was AssemblyLanguage (68K, specifically). The assembly course roughly coincided with the microarchitecture course, which was helpful. CeeLanguage and CeePlusPlus came later, and were not heavily emphasized. Pascal was used all over the place (I wrote a CPU simulator for one class in TP3, and my senior project - an OO windowing GUI for DOS apps - was written in TP5.5). I didn't really learn C or C++ until I got out of school.

As I understand it, Java is now the teaching language there. -- MikeSmith

At CarletonUniversity, my freshman year was the last to have SmallTalk as our first language, with C/C++ shortly after. Now we've switched to Java. This has led to disasters in the upper-year courses, as previously feasible projects are becoming impossible to implement in the necessary timeframe. For example, in my data mining/artificial intelligence course, we had to write a backwards-chaining inference engine. The class was split between Smalltalk and Java. All but one Smalltalker got it done and working. Only one Java implementation worked, and that was direct plagiarism of JProlog (so, really, it didn't work). -- SunirShah
AmherstCollege? as of 2004

Amherst College teaches Java for the intro courses, and uses it for all the general CS courses it can. OS Design is in C/C++, Programming Language Paradigms includes Haskell and Prolog (though the bulk of programming is still in Java), and AI uses Lisp and Prolog. Some of my compiler design prof's code made me think I was programming C, though.

Personally I think C or Scheme is the best instruction language. Both are small languages, reasonably simple to pick up, and are very good at illustrating basic CS concepts. The quirky syntax is easier to pick up as a freshman, before being corrupted by other programming languages (prefix syntax in particular is good to introduce early). They're also not too esoteric - C is almost the lingua-franca of computing, and Scheme is used in a reasonably large number of places.

Python gets an honorable mention for having a very non-threatening syntax and being able to do many cool things with it. But it doesn't really illustrate any CS concepts. -- JonathanTang
Harvard used C as of the late 90's. Johns Hopkins uses Java.
University of Guelph in Ontario used JavaLanguage for the introduction in comp-sci/comp-eng programmes, and then quickly moved on to abusing them with excruciating amounts of C in second year. Then they'd release the comp-scis into the wild, fun world of PythonLanguage for high-level design, and the comp-engs into the painful world of various microcontroller assembly languages and AdaLanguage. Other, non-computer engineers got one crash course in C because they will occasionally be expected to analyze a problem using C libraries... a course they don't get a damn thing out of - as such, they eschew all programming languages but MatLab and WaterlooMaple?.
University at Buffalo, as of 2009, uses JavaLanguage for the introductory sequence, CeePlusPlus for the introductory courses on algorithms and data structures, CeeLanguage for systems courses, a bit of VerilogHdl and MIPS for the required hardware courses, CommonLisp for AI courses, and a grab bag for language courses (depending on the whim of the professor; mine used CommonLisp, SchemeLanguage, FortressLanguage?, SmlLanguage, PrologLanguage, and CsharpLanguage). Professors often require more languages, especially for electives, such as bash, RubyLanguage, PerlLanguage, PizzaLanguage, SqlLanguage, DatalogLanguage?, SmallTalk, or MatLab. You can write your senior project in anything you can talk your teammates into going with.

Personally, I, from my imperfectly neutral perspective, think that the multi-language route is definitely the way to go. If I have a complaint, it's that they stressed the OO angle an awful lot, especially in the first year. -- ThomSmith?
UniversityOfWashington, as of 2012, uses JavaLanguage for introductory courses; CeeLanguage and CeePlusPlus for data structures/algorithms, systems, etc.; CommonLisp and PythonLanguage for AI courses; and SchemeLanguage, MlLanguage, and RubyLanguage for language courses.
I still think that the PascalLanguage is a good teaching language. It has records (C struct) and pointers. But, unlike C/C++, you don't have to use pointers until you need them, and the language restricts what you can do with them. The PascalLanguage can teach you that CompilerErrorsAreYourFriends.

I would also teach them the SmalltalkLanguage, so they understand what real objects are like.

A variety of other languages would be good exposure too, but it's important to get one or two "good ones" under your belt for the majority of school work.

In the early '80s at the California State University at Fullerton (CSUF), BASIC (RSTS BASIC-PLUS) was very popular. The students would often fail to get their programs running, but it was very popular. -- JeffGrigg

I'd say Smalltalk. I've been playing with the idea of designing a language recently, and just in the area of syntax, there is this fundamental schism between the message send syntax that Smalltalk and Self have and all of the languages with Algol and Fortran in their lineage. You just can't do either halfway and keep it simple in my opinion, and there are benefits in the Smalltalk way. I think the closest thing to a middle ground is Ruby. Come to think of it, that would be a very good first language. -- MichaelFeathers
If we are talking about a First Language to be taught to College Freshmen, to both CS majors and interested students in other disciplines, then I have the following thoughts:

There are two types of freshmen: the ones who already know how to program (with all the bad habits they learned) and the true novices who don't know their IF from their ELSE.

I'd prefer to save money for college students and universities, so I would suggest an OpenSource language that runs on Unix, Windows, and Mac OS. I'd also suggest an interpreted language. Learning to use a compiler is an important skill, but I think InterpretedLanguages are more helpful in learning how to program.

So my vote is something that enforces good programming habits (to correct the experienced programmers' defects), something with syntax that approaches natural language (to aid the newbies), something that's 'relevant' (to most college freshmen that means they can use it in their webpages), and something that can be ramped up to include ObjectOrientation.

As a novice programmer, I'd recommend the PythonLanguage. It's easier to read than Perl (after all, someone's going to have to grade the SourceCode), the syntax is pretty natural (see HelloWorld in Python), it can be used with the CommonGatewayInterface, and you can do ObjectOrientedProgramming with it.

-- SeanOleary (Sometime in 2001)

It's a year later, and I'm in the process of changing my mind. SqueakSmalltalk has all the things going for it that Python does and studying SmalltalkLanguage has really helped me understand ObjectOrientedProgramming. I'd still choose Python though. See you in 2003. -- SeanOleary (Sometime in 2002)

You there Sean? It's time to make another comment ...

low-level languages: C, assembly, and machine code (CeeLanguage, AssemblyLanguage)

Maybe you could look at it this way:
If the goal is to use a computer as a tool to explore itself, a lower-level language is a good choice.

If the goal is to use the computer to solve problems beyond the computer itself, a higher-level language is a good choice.

You must learn C and assembly. Those are your low-level languages, and they are not debatable (with some flexibility for replacing C with C++). The choice of a high-level language is what's up for grabs because the industry hasn't standardized on problems to solve like they have on the hardware used to solve it. That is, while we all use von Neumann machines, we don't all program n-tier web applications, medical scanners, flight simulators, or wristwatches.

I don't see anything special about C and assembly (or machine code). It seems likely that there are better languages for talking to hardware, but we haven't bothered to invent them yet. The native language of hardware is, currently, voltages and suchlike. This may yet change - but that's beside the point; machine code is itself something that requires further translation.

I have never met a coder who I consider to be fully competent, and who hasn't done some assembly. I *have* seen people improve by leaps and bounds after learning a bit of assembly. While I wouldn't suggest it as a first language (although it was mine), I find that programmers with no assembly experience often have huge gaping holes in their understanding. So much so that I am very wary of hiring anyone who hasn't ever done it...

What competency does knowledge of assembly give you? I'd much prefer having a programmer with a good understanding of algorithms, abstract data types, and complexity than one who knows how the bits are being twiddled in the processor. Sure, a programmer should understand how memory is arranged, how processors work in general, etc. But you get that in Operating Systems and Architecture classes without ever doing any assembly programming. -- BrianRobinson

I don't know any highly competent programmers who don't know a bit of assembly. I only know one who actually uses it these days. So perhaps it is just highly correlated to competence, not causal at all. On the other hand, learning assembly will teach you a lot about computation in general; this could lead to a better understanding of algorithms, ADT's. It will certainly teach you something about complexity, and give you a much better idea of what a compiler does, what a virtual machine is, etc.

I wonder, however, if it would be better to start off with a FunctionalProgrammingLanguage. If you start at an early stage with assembly or C, does that emphasize linearity of thinking too much? -- AndyPierce (who would like to try HaskellLanguage but doesn't have time anymore :( )

I agree. A simple functional language could be ideal to start with (see the PLT Scheme effort, for example). Other languages should follow; introduction to programming should include several language *families*, not merely several languages.

Cambridge University actually teaches ML (a functional language) as the first programming language to undergraduates (freshmen). It is generally taught as "If you can understand mathematical induction, you can understand this code, and then we can teach you some other code from there."
My formal education in computer science started out in SchemeLanguage, which was a nice language to learn different approaches, like functional and OO programming. But almost any modern and truly powerful language would work for that. I think C may have a role as a lingua franca of programming, which makes it important to learn, but I guess you could live without it. -- AndersBengtsson
At Imperial College, London, I was taught functional programming (with Miranda), then imperative programming (MODULA-2), logic programming (Prolog) and assembly (PDP-11). The course covered medium-scale design using modules and data abstraction in MODULA-2. The second year covered large-scale design, with courses in concurrent and distributed programming, compiler design, object orientation (SmallTalk) and software engineering methodologies.

Now I believe they use Haskell rather than Miranda, Turing and Java rather than MODULA-2, and introduce object-oriented design in the first year rather than modules and ADTs. I'm not sure what assembly language they teach, but I'm sure it's not PDP-11!

-- NatPryce
I find this topic of interest, because I didn't receive any formal education in programming. My father was trained in electrical engineering by the military, and as a hobby project built a computer from a kit in the early eighties. The first programming language I learned was Z80 assembler and the second language was BASIC. However, as there was no Internet or CompUSA, we had to order a BASIC ROM from Texas Instruments. The BASIC they provided was very simple and lacked a lot of functionality. So, in the process of learning the language, I extended it to provide rudimentary graphics, sound, and IO processing; of course, all of this was done in Z80 assembler.

Right from the start, I understood the fundamental concepts of a computer and how it worked. I used to draw my own memory maps and hang them on the wall so I could remember how things were structured when I was coding assembler. In those days, it was easy with only 8KB of RAM and a 16KB ROM image. Still, I understood concepts like: pointers, banking, ROM vs RAM, memory shadowing, CPU/bus IO, and more.

I guess I'm old-school, but I think that the great programmers are the ones that embrace the computer as a machine first and then as an abstract entity later. I spend less than 20 per cent of my time looking at or coding assembler today, but I can still do it for a number of processors and I can tell you how a compiler is going to generate instructions for a given processor.

Just think about it: The very concept of a virtual machine is to emulate a physical processor. Java and Smalltalk (and many other products) are based on this concept. How can anyone truly grasp the fundamentals of these languages if they don't understand the underlying concept of the processor they're running on?

Learning to program is a lot like learning to play a musical instrument: Anyone can play some keys on a piano and make "music", but they probably don't understand the underlying theory of the music. My brother is a guitarist, and the more I watch him learn and apply the theory he's laboured to grasp, the more I realize that the fundamentals of any discipline are crucial to applying it skilfully. -- JeffPanici

"Programmers are the ones that embrace the computer as a machine first and then as an abstract entity later"... I like this sentence and I agree. -- MauroPanigada
I think there are really two questions here: what should be the FreshmansFirstLanguage, and which should be the FreshmansFirstThreeLanguages??

Given that many freshers will never really have programmed before, the FreshmansFirstLanguage must allow the freshers to go from a standing start to easily writing useful (if still ultimately toy) programs in a few weeks; of all the languages I've tried, I think PythonLanguage is the winner here. SmalltalkLanguage is close, but the effort required to learn its extremely rich environment (and the nonstandard, if superior, windowing conventions which most STs have) does not result in a transferable skill; only Smalltalk has the Smalltalk environment, whereas almost every other language has a command-line compiler or interpreter of some sort. Java is also close, but is that little bit more verbose (writing enclosing classes for single-method programs, declaring types); however, it has the big edge that it's widely used in the RealWorld.

However, those pushing CeeLanguage and AssemblyLanguage have a point, too: these languages really make you understand how the machine actually works. Thus, one of these low-down-and-dirty languages should be the FreshmansSecondLanguage?. I would favour asm over C, as it has less magic (malloc and free - too high level!), except that modern RISC machines are a nightmare to write asm for manually (exposed pipelines and all), and the x86 isn't much better. Perhaps students should write asm for Knuth's MmixMachine? That would lose a lot of the point of the exercise, though. On the other hand, C is going to be a lot more useful in future life.

As for the third language, perhaps something really computer-scientific, like ML; they'll need this if they're going to do the usual formal CS stuff.

So, in an ideal world: python (or smalltalk), then assembler for an easy-to-use RISC, then ML. In the real world: java, C, then ML.

-- TomAnderson

While I'll agree that x86 and RISC based assembly languages of today aren't as simple, they are easier in many ways. For example, 32-bit x86 assembly language programming under Win32 is very straightforward. In addition, having 32-bit address registers/addressing modes is a godsend compared to zero-page and/or RAM banking. Coding RISC reminds me of my early days: I use hand-drawn pipeline charts to keep track of what the processor is doing. Sometimes, I even comment the pipelines in the sourcecode. Consider this: one of the first challenges I had when I was coding z80 assembler was how to encode a number larger than 255. Students won't have to worry about this problem today. -- JeffPanici

The main problem with the x86 (not that I'm overly familiar with it) is that it's fairly register-poor; isn't there also something a bit weird about floating-point? And, as you say, RISC chips expose too much of their internals. Still, yes, far better than the olden days! -- ta

Not all modern RISC needs to be so explicit; assemblers could even be able to rearrange instructions properly. Anyway, I think that teaching how to deal with numbers larger than the maximum the processor handles natively is a good thing. This is the kind of thing that makes our minds smarter. -- MauroPanigada
I'm not convinced that learning about pointers and how memory works is as useful as all that. Modern languages like Java, Visual Basic and SQL shield you, to a great extent, from such things. I think the best way for the software development movement as a whole to move forward is to stop concerning itself with low-level details and start building and leveraging more high-level objects.

We can't forget that behind the scenes there must still exist people who manipulate pointers... How would you write a VM? You can use whatever language you like, indeed, but at some point you need assembly... OK, let's not go all the way back to machine language or assembly; let's start with C or similar... you need someone who can keep the C code working... likely your preferred language's compiler/interpreter was written in C. Then, with the first generation, you could rewrite an interpreter/compiler for X in X itself! Maybe sometimes this approach gives advantages (issues about efficiency?); but still the hardware is what it is, so there will come a moment when a "low-level" expert (someone who is not scared by pointers) is required to obtain something from a computer. -- MauroPanigada
Low-level details are a part of the subject, so they shouldn't be dropped altogether. Some exposure to them (not having to write lengthy programs in assembly language) at an early stage is a good idea - if you really dislike the experience, you're probably not right for computer studies/science or software design/development anyway. Perhaps it would be useful to show students some specially-prepared short, but fully-commented programs written in low-level languages, together with corresponding modern programs with similar functionality.
I disagree. Low-level details do not need to be part of the subject. Hardly anybody needs to write in assembly language these days - compiler writers are about the only people. It's a waste of time and effort in almost every other circumstance. Modern optimizing compilers can produce excellent code and the days when you really needed to squeeze the last few drops of processor power out of your machine are in the past. Let's move on! Teach new programmers how to break down problems, how to avoid writing complicated code, how to communicate. Don't waste their time teaching them dinosaur skills that are irrelevant to their future.

You don't disagree at all - because I didn't advocate teaching them those skills - merely asking them to understand more, and be aware of the history of the subject.

So they'll be tested on it? Maybe HistoryOfComputing should be an optional subject. I don't see its relevance to the main-stream.

To some extent. For comparison, if someone studies statistics, it's best that they know how to do simple arithmetic, but I'm happy to let them use a calculator most of the time, and in any examinations.

Yep, I've never understood why they still insist on teaching long division in schools! YouAintGonnaNeedIt :)

I think a passing familiarity with assembly and machine architecture is required for programmers to have some idea of the 'physics' of the system they are working in. This is just as important as the understanding of the 'mathematics' which is provided by doing all the formal ML stuff, but no more so. Neither contributes directly to being a good programmer, but they are the foundation for learning that. -- ta

You don't need to know how to grind wheat kernels and cochineal beetles to make tasty pink cupcakes!
For someone who spent several years writing assembly-language programs almost exclusively, the idea of not requiring (or at least offering) a course in assembler seems ridiculous. All of the "low-level" concepts such as memory management, pointers, and stack frames are still important today. As for what a programmer learns from assembler:

How to break down problems: You can't express a complicated concept in assembler. You have no choice but to break it down into tiny little steps.

How to avoid writing complicated code: If you think debugging code in a high-level language is hard, try debugging a few hundred lines of assembler. It's a very good incentive for writing simple code.

How to communicate: Assembler is not self-documenting by nature. Reading and understanding how and why a machine acts the way it does, and communicating the intentions of an otherwise cryptic series of mnemonics, is very educational.

Of course, my first and second languages were BASIC and Pascal, and there's no question that the skills gained with those languages are more useful for everyday work. But assembler (or machine code, whatever) is still underneath everything that happens, and it doesn't seem to be going away. :-) -- CraigPutnam

I was a computer science major as a freshman, and I have to say that our introductory sequence was really effective in teaching me about memory management and good debugging methods. What we did was start with a dialect of LISP called Scheme, implement a mock assembler, and then move on to see the same concepts at work in C/C++. Not only did it move on quickly to the "real" languages, but I felt I got a solid foundation in how everything really works before going on to do "real" programming. I'll even brag and say that I felt like it made me a good debugger.
If you have to teach an assembly language (and I don't deny it's a good idea at some point, but I certainly wouldn't start with it), how about the 68k assembler? It's what I was exposed to at the UniversityOfToronto, and it's pretty straightforward. You get 32-bit values easily enough (and even 32-bit addressing, though with a 68000 the high 8 bits might be used for various special purposes and ignored in the actual address deref), but don't have to worry about weird things like speculative computations, bizarre register hierarchies, knowing that the branch instruction has to come one or two before it logically ought to, etc. The main difficulty is adding the (correct) qualifiers to specify byte/word/long data size, and that words have to be aligned.

In a freshman class, it's best to stick to an easy-to-use language with few gotchas.

I've found my share of rather annoying gotchas - and just general things that make it hard to get your code to compile/run - in every language I've had to use (especially C++, but that's another rant). Perl is one of the highest-level languages out there, but just look how long "man perltrap" is. (Okay, maybe "high-level" != "easy-to-use" here. ;) )

Anyway. In university, I saw C first, then C++, then 68k assembly, then Java. This seemed like a reasonable order to me, complementing what I was looking at on my own time (Mostly Perl, but also some unusual things like Postscript, and esoteric things like Befunge).

In a program rather similar to my own, they start with Java and head towards C - don't know if they're required to take any of the "architecture" courses where assembly is taught - which IMHO is an awful way to do things. Worse yet, they teach Java in a non-OO way the first time (I don't even know how this is done! They must have to tell the students that "public class Foo {}" bracketing their file is simply magic that keeps javac happy, or something) and then in third year they get to take a general "programming languages" course which is about the paradigms of languages more than the syntax - so they learn *proper* Java, and introductions to C++ and Scheme (and a week at the end on scripting languages), all at once in a half-term course. Simply hideous.

(I took this course too as an elective; at that point I'd seen C++ in my studies and Java independently, so it wasn't a big shock for me. I'd also read DesignPatterns the previous summer, so I ended up seriously overdesigning the last project of the year. It was a real tour-de-force, but not the best way of doing things as I later came to realize. Still, valuable learning experience.)

-- KarlKnechtel
I've learned to program on my own, in the last three or so years, and am a bit of a language junkie anyway. I'd tell anyone who wants to learn to program to learn Python, Lisp (not Scheme), C, Assembly and... oh, something pretty OO, say Objective C or Smalltalk or Java, in that order. Then I'd point them towards OCaml and Forth and Haskell, if they hadn't had enough yet. The point of languages, the way I see it, is to learn how to think about things differently, and to make more things seem possible. Someone who's taught purely with C and asm won't think of composing and manipulating functions like someone who was raised on Lisp or ML, and someone who only knows Python and Lisp won't think they can make systems programs or compilers or operating systems, because the machine will seem like some frightening black box they can't penetrate.

-- Simon

"...because the machine will seem like some frightening black box they can't penetrate."

Indeed. That's exactly it. You need both ends of the spectrum; from that, you can get anywhere in between, but it's damn hard getting outside of the end-posts once you've been running between them for a while. The danger of no low-level is that you start thinking that running the compiler from a command prompt and twiddling raw bytes in a file is 'low-level'. And it goes both ways. -- WilliamUnderwood
Perhaps introduction to programming should include several language *families*, not merely several languages.

What programming language families are there? Given the wide variety of tasks/problems, and the fact that no single language covers all of them, it would be nice if the combination of all the languages learned covered as many tasks/problems as possible (which is not the same as saying any particular language is widely applicable - I'd rather learn lots of LittleLanguages that cover lots of niches, than a couple of BigLanguages? that overlap almost completely and so end up not covering as many niches.)

As an electronics guy, I'm interested in languages that can express lots of simultaneous operations in parallel - is that what LogicProgramming is about? Do HardwareDescriptionLanguages? like Verilog, VHDL, etc. go in that category?

-- DavidCary

Yes, that is what HDLs do. I wouldn't recommend them as a first language, however; they are somewhat frustrating, as their relatively small user base and the dominance of a few vendors mean that a lot of the language is badly supported (simulation tools are not too bad, but synthesis tools tend to constrain what is actually used; in our design flow (VHDL), we cannot use record types because several downstream tools can't handle them, resulting in a lot of copy-and-paste programming).
I'd start them with Squeak Smalltalk, and have them write a simulation of a (simple) piece of hardware as a term project. My first language, in my first programming course (in 1974), was Fortran - in which the semester assignment was to write a simulator of a PDP-8. Tests consisted of PDP-8 executables supplied by the instructor, where the output of our program was compared to the real thing. I hope the analogy is obvious. -- TomStambaugh
To those who are proposing that pointer arithmetic and computer architecture are important to learn straight away, and thus recommending CeeLanguage and AssemblyLanguage, I would suggest ForthLanguage instead. You get all the advantages of playing with pointer arithmetic, but you also get an interactive environment which speeds learning, testing, and debugging. Forth is very amenable to teaching decomposition of problems, since it is so easy to factor. There is also very little syntax (besides RPN) for the student to learn. One can also extend Forth to explore other paradigms from the bottom up such as ObjectOrientedProgramming and FunctionalProgramming (although then it might be better to learn SmalltalkLanguage and HaskellLanguage). -- IanOsgood
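For readers without a Forth handy, the stack-based RPN evaluation model that Forth exposes directly to the learner can be sketched in Java with an explicit stack (a toy illustration of the idea, not how any real Forth is implemented):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy RPN evaluator: numbers push onto a stack, operators pop their
// operands and push the result - the core of Forth's execution model.
public class Rpn {
    public static int eval(String expr) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String tok : expr.trim().split("\\s+")) {
            switch (tok) {
                case "+": stack.push(stack.pop() + stack.pop()); break;
                case "*": stack.push(stack.pop() * stack.pop()); break;
                case "-": { int b = stack.pop(), a = stack.pop();
                            stack.push(a - b); break; }
                default:  stack.push(Integer.parseInt(tok));
            }
        }
        return stack.pop();
    }

    public static void main(String[] args) {
        // "2 3 4 * +" is postfix for 2 + (3 * 4)
        System.out.println(Rpn.eval("2 3 4 * +")); // prints 14
    }
}
```

Note how little syntax there is to explain: tokens are either numbers or operations on the stack, which is exactly the pedagogical point made above.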
The best first language is obviously Scheme, since it is the language used by the best first textbook: StructureAndInterpretationOfComputerPrograms. -- SmugLispWeenie from CalBerkeley
At one job I was replacing a recent CS grad from a large university. My manager asked me to review his aborted attempt at completing his first real programming assignment. He chose Java. (Yes, they really did let us choose whatever language we wanted.) Java may have been a decent choice. Myself, I went with Perl. His program sucked - no question about that - and didn't work to boot. The amazing thing is that it seemed to go on forever - 20 pages - to solve a simple problem.

This is the real shocker: he actually wrote his own sort method! I guess his CS teachers never mentioned that you normally don't need to write your own data structures at the application level. Any real programmer would know there HAS to be a sort function somewhere in the API. It's java.util.Arrays.sort(myArray). That's one line - not two days' work, buddy. (That's a tuned quicksort, by the way.) I wrote a Perl program in three days. Hey, I'm not bragging - this honestly wasn't hard stuff. Today, I also know Java and could write it in Java in about four.
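For reference, the one-liner in question (the array contents and names here are mine, for illustration):

```java
import java.util.Arrays;

// The standard library already ships a tuned sort for primitives,
// so there is no reason to hand-roll one at the application level.
public class SortDemo {
    public static void main(String[] args) {
        int[] data = {5, 3, 8, 1};
        Arrays.sort(data);  // sorts in place
        System.out.println(Arrays.toString(data)); // prints [1, 3, 5, 8]
    }
}
```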

In fact he wrote just about everything he needed from scratch... which is why he never really finished anything useful.

This is a real failure of education - I guess it's kind of sad. I went to the same university and have first-hand knowledge of this failure. You can get a degree there by writing nothing but toy programs. The 400-level classes are basically just more difficult versions of the 100/200-level classes.
"[I]t may be faster to first teach beginners Python and then Java, rather than Java as a first OOPL." --

How could it possibly be slower to learn Java, than to learn Java but first learn something else? -- DavidCary

See TelescopeRule
There's no need to teach kids one specific type of OOP. What they need to know is general reusable programming concepts, such as function calls and the like. Teach 'em SchemeLanguage! - WouterLievens?
a "relevant" language -- (to most college freshmen, that means they can use it in their webpages -- CommonGatewayInterface)

While nearly all programming languages can be "used" on a web page via the CommonGatewayInterface, nearly all of them must run on the web server -- this seems to go against ProgramComputersYouCanUnplug.

Not really; you can run webservers on pretty much any computer, or from a USB stick that you can carry from one computer to another. In fact, I can't imagine doing web development without a localhost environment to experiment on - one where I can try out anything without having to worry about affecting real live websites or other people's accounts. But I doubt a web language should be first. There are so many layers in the stack - from the O/S through the servers, the network and protocols, on to the varying compatibilities of client-side browsers, not to mention the basic client-server concepts - that it would be a lot of different concepts to learn at once, in addition to programming itself.

Once there were only 2 languages that new programmers could use in their web pages while following the advice of ProgramComputersYouCanUnplug: JavaScript (which no one seems to take seriously as a RealProgrammingLanguage) or JavaLanguage. But now OtherLanguagesForTheJavaVm seems to open things up again.

Has anyone tried using any of the OtherLanguagesForTheJavaVm to teach a first language? Such as JavaPython?
At Tulane University, we did Pascal->C->Scheme, with optional assembler. I thought it was perfect. Pascal developed your debugging and structural skills and safely introduced pointers and structs. Then we moved to a low level and did all the pointer arithmetic and such, which is very important. Even a high-level language like C# can cause someone problems with int overflow. Would someone without low-level experience ever use a .NET Stack over an ArrayList? Would they even know what a stack is? No. Anyway, we wrapped up with Scheme for all the ML stuff, which was wonderful. I'm very glad we didn't cover OOP at all. It would have screwed up my mind.
At Carnegie Mellon University, I would say we have four core sequences of courses, three of which are relevant to this discussion:

Typical Schedule: (The courses can be shifted around easily, and some people take four semesters to finish the core.) After finishing this sequence of courses, I feel one has a very solid foundation. You can easily go from SML and pick up Scheme and then other variants of Lisp. With regards to low-level programming, many students after finishing this sequence of courses go on to program an OS Kernel in 15-410. And, of course, although 15-211 focuses more on high-level algorithms, 15-111 gives a very solid background of OO programming.

-- EdwinShao
At the University of Cambridge, they taught ML as the first language, then Java, and then in the second year there was a course on low level stuff where we used Verilog to design circuits and programmed an ARM chip in assembly. There was also a programming languages course where they taught C, C++ and Prolog assuming the previous knowledge. I know that the teaching of ML first (which supposedly could be someone's first ever programming language) put several people off, but I don't totally disagree with that approach. The first programs they introduced were the usual I/O programs, with recursive and tail-recursive factorial being exercises for the second week. If someone was coming at programming from mathematics, I can see that that might be easier for them than an imperative language. However, nearly everyone who applies for a computer science degree has written a program in an imperative language before, so lots of people got confused and impatient for mutable variables and loops to be introduced.
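Those two second-week factorial exercises, transcribed into Java for illustration (the course used ML, whose compilers turn the tail call into a loop; javac does not, but the shapes are the same):

```java
public class Factorial {
    // Plain recursion: the multiply happens *after* the
    // recursive call returns, so each call must keep a frame.
    static long fact(long n) {
        return n <= 1 ? 1 : n * fact(n - 1);
    }

    // Tail recursion: an accumulator carries the partial product,
    // so the recursive call is the very last thing done.
    static long factTail(long n, long acc) {
        return n <= 1 ? acc : factTail(n - 1, n * acc);
    }

    public static void main(String[] args) {
        System.out.println(fact(10));        // prints 3628800
        System.out.println(factTail(10, 1)); // prints 3628800
    }
}
```

Coming from mathematics, the first form reads like the textbook definition of factorial, which is presumably why the course led with it.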
I just came from a talk by MatthiasFelleisen, co-author of HowToDesignPrograms (HtDP), who has tackled this problem over the last decade at RiceUniversity, Northeastern University, and dozens of high schools and colleges. As a PltScheme maintainer, he developed DrScheme (which has multiple language levels with progressive disclosure of more advanced features, a stripped-down IDE with a ReadEvalPrintLoop, and error messages tailored to a beginner audience) to introduce the very first programming concepts and provide a gradual learning slope from start to finish. In the first year, he also introduces a subset of JavaLanguage, ProfessorJay?, partly for flexibility and to prep students for a semester of co-op work at local businesses.

However, he emphasizes that the particular language choice is not as important as teaching a good design methodology. In fact, he sees the first CS course as a Liberal Arts class, teaching critical thinking, problem solving, and attention to detail. It is reported that HtDP has helped students in as wide a range of fields as journalism, speech and debate, music, and poetry!

The HtDP design methodology consists of six stages:
  1. problem statement, what data input/output, examples
  2. purpose, contracts, type signature
  3. construct behavior examples
  4. from the data, develop the program organization
  5. write the first code
  6. from the examples, write your test suite
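A minimal worked example of following those six stages, sketched here in Java rather than the Scheme-based teaching languages HtDP actually uses (the problem and all names are mine):

```java
// 1. Problem statement: given a temperature in Fahrenheit, report it
//    in Celsius. Data: a double in, a double out. Example: 212 F is 100 C.
// 2. Purpose & contract: fahrenheitToCelsius : double -> double
// 3. Behavior examples: 32 -> 0, 212 -> 100
// 4. Organization: a single pure function, no state needed.
public class Convert {
    // 5. First code.
    static double fahrenheitToCelsius(double f) {
        return (f - 32.0) * 5.0 / 9.0;
    }

    // 6. The behavior examples become the test suite.
    public static void main(String[] args) {
        if (fahrenheitToCelsius(32.0) != 0.0
                || fahrenheitToCelsius(212.0) != 100.0)
            throw new AssertionError("example failed");
        System.out.println("all examples pass");
    }
}
```

The point of the recipe is that the contract and examples exist before any code is written; the code and tests then fall out of them.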

Results of using the HtDP method compared to the previous CeeLanguage intro course required for the engineering tracks:

Here are some other salient points from his talk:
If the whole problem is to learn programming in a rather general way, I think the language is not so important after all; it could be assembly or an object-oriented high-level language. If the student learns that "programming" means "being able to write an algorithm in this specific language", the teacher has failed, since the student will not be able to write the same algorithm in every language s/he can learn. Writing good code in a specific language is a matter of experience and pragmatic knowledge of that specific language; I don't see too strong a relationship with general programming ability, which is something that should be learned regardless of the choice of language.

Maybe there exist languages that make the minds of students less stretchy, and languages that make them more stretchy; the former should be avoided, the latter chosen. But after all, which language makes us better or worse programming-minded can depend heavily on our own minds, so if, say, C was good for me, it might not be so good for you.

I believe that if a fresh(wo)man really understands the deep meaning of programming a computer, then s/he will be able to do it with any language s/he spends time learning: there are no obstacles (except time and need: in the end, if you make it a job, you learn the language that allows you to earn...). (Of course, it is better to learn more than one language!)

E.g. I started with ZX Spectrum BASIC (very young, 8 years old or so). Was this a good choice? Could I have started with C? I don't know, but I know that now computer languages are not an issue for me (surely I exaggerate; there are still languages I find hard), and I continue to explore languages (Smalltalk is my latest entry)... So, is Spectrum BASIC a good choice for a freshman? No. The fact is that I was more focused on understanding how to do things than on studying the tool for doing them: BASIC was just a language to express a meaning (a tool); I wanted to understand the meaning. Once I had done that, I could have (re)written it in any language (BASIC was the one I knew, but it was not a limit; ... I wrote my first C program without having a compiler... to check what the code did, I became a C interpreter, according to the K&R manual I had) -- MauroPanigada
My friend had recently enrolled in a programming course and came to me for help. I was surprised to see the curriculum start with CeeLanguage, which is as obscure and unsafe as languages get; I was astonished when I saw the course's code snippets. The samples were written in straight-line style, assuming that every library function succeeds, all the way down to the final { return 0; }. Horrible.
I'm a little surprised that no one has brought up this possibility: Learn a bit of Python or C or Scheme--just enough to get an idea of how to program--then build your own computer, starting with Nand gates, and then building up a machine language, an assembler, a higher-level language, an operating system, and finally a game of some sort. Such a project would tie together all the disciplines of computer science, from architecture to language design to data structures and algorithms to operating systems--and it can be done in a semester, or a year.

At least, according to Noam Nisan and Shimon Schocken, who wrote "The Elements of Computing Systems", and who like to call their class FromNandToTetrisInTwelveSteps.
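The first rung of that ladder can be sketched even in plain Java: every other logic gate can be derived from NAND alone (a software model for illustration, of course, not the HDL design the book actually has students write):

```java
// Building the basic gates from NAND only, the opening exercise
// of the Nand-to-Tetris course.
public class Gates {
    static boolean nand(boolean a, boolean b) { return !(a && b); }

    static boolean not(boolean a)            { return nand(a, a); }
    static boolean and(boolean a, boolean b) { return not(nand(a, b)); }
    static boolean or(boolean a, boolean b)  { return nand(not(a), not(b)); }
    static boolean xor(boolean a, boolean b) { return and(or(a, b), nand(a, b)); }

    public static void main(String[] args) {
        System.out.println(xor(true, false)); // prints true
        System.out.println(xor(true, true));  // prints false
    }
}
```

From gates like these the course composes adders, then an ALU, then a CPU - each layer built only from the one below it.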

This class was not available to me when I was in college, working on my Computer Science Minor, so many years ago. Oh, how I wish it were! (I also wish I could have learned Lisp during that time, or at least come across some very good reasons why it is a very important language to learn.)
I think Computer Scientists must know the basic concepts: object-oriented programming (Java's great for learning that), functional programming (PLT Scheme's great for learning that), and how the computer works (assembly's great for learning that).

I think Software Engineers must understand how to get things done: Python's great for that! C's great for that! The trick is to use what's appropriate for your given task.

So which should come first? Does it matter? I don't think so, as long as you understand what's necessary to get the job done. ~TJL

EditText of this page (last edited January 11, 2013) or FindPage with title or text search