A language that doesn’t affect the way you think about programming is not worth knowing. – Alan Perlis
Little correlation has been found between performance and years of experience in the field. Actual productivity is better correlated with the number of programming languages the person knows. - Edward Yourdon
Programming Languages I Have Used
I have used many languages. This is a list of those I can remember.
I first used C++ in 1990. My first impression was that regardless of whether or not this OO stuff was any good, being a type safe version of C was a good thing. Soon I decided that this OO stuff was a good thing. I fell in love with RAII long before I ever heard that awful name. My eyes really opened when I read Barton & Nackman's Scientific and Engineering C++, opened even wider when I read Josuttis' The C++ Standard Library: A Tutorial and Reference, and nearly fell out of my head when I read Alexandrescu's Modern C++ Design: Generic Programming and Design Patterns Applied. This is a language that you can continue to learn your whole life!
Don't get me wrong, C++ is far from perfect. I still think its syntax is ugly (see SPECS for a proposed improvement); it needs a real macro capability (template metaprogramming is good, but not good enough, and unfortunately FOG is not it); Design by Contract is a bear; and I wish functional programming were easier. But when you use good libraries like STL, Boost, Loki, STLsoft, FC++ and LC++, it is a very good language. It was supposedly Stroustrup who said "in C it's fairly easy to shoot yourself in the foot, whereas in C++ it's harder to shoot yourself in the foot, but when you do, you usually blow your whole leg off", so watch out for the gotchas.
I first used C in 1985. C is the lingua franca of programming: if you want to call the same function from C++, Java, Python, J, R, and Clips, you have to call it through a C interface. C is very dangerous to work in, but there are good tools to mitigate the danger. Most noteworthy is splint.
I have not done a lot with Java. I worked through all of Sun's Java Tutorial in the late 1990's and read several books on the subject. I used Java on two contracts in the early 2000's, both times writing installers: one set up an Axis SOAP server, the other set up an Oracle database and a J2EE server using JMX.
Java has been getting a lot of hype, but it looks like back to the future to me. Am I the only person who remembers the p-System? It was a virtual machine that executed p-Code on any machine that ran the p-System so you could write once and run anywhere in the late 1970's! The p-System died because p-Code was much slower than native code. Java claims to have solved this problem, I do not know whether or not this is true, but I do know that the JVM takes forever to load on every system I've used it on, and it chews up memory.
I am told that Java 'fixes the problems with C++', but I'm not convinced. It is garbage collected, but memory is only one of the resources a programmer needs to manage; garbage collection does not stop you from dropping file handles or other system resources. Using RAII in C++ makes managing memory, and any other resource, fairly painless. Besides, I've always believed that if you do not know the life cycle of your objects, you have bigger problems than resource management. I have also noted that on every large scale Java project I am familiar with, a lot of developer effort has been devoted to controlling the garbage collector. To get the dubious advantage of garbage collection I have to give up RAII, templates, multiple inheritance, conditional compilation, free functions, and link time checking. I'm not convinced. And if I really want garbage collection in C++ I can use the Boehm conservative garbage collector.
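The file-handle point above can be sketched in a few lines of Python (the class and counter are made up for illustration): garbage collection frees memory eventually, but a scarce resource should be released deterministically, and a scope-bound cleanup in the RAII style guarantees that even when an exception is thrown.

```python
# A toy "scarce resource" with a counter standing in for a handle pool.
# The context manager gives the RAII-style guarantee: cleanup runs when
# the scope is exited, exception or not, rather than whenever the
# garbage collector gets around to the object.

class TrackedResource:
    open_count = 0  # pretend this is a limited pool of handles

    def __enter__(self):
        TrackedResource.open_count += 1
        return self

    def __exit__(self, exc_type, exc, tb):
        TrackedResource.open_count -= 1  # released at scope exit, not at GC time
        return False  # do not swallow the exception

def use_resource():
    try:
        with TrackedResource():
            raise RuntimeError("something went wrong mid-use")
    except RuntimeError:
        pass

use_resource()
print(TrackedResource.open_count)  # 0: released despite the exception
```

With plain garbage collection the handle would stay open until some indeterminate future collection; a `finally` block (the Java idiom) works too, but the caller has to remember to write it every time.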
Don't get me wrong, Java has its place: Applets are great; Swing is nice; Reflection can be useful; and being able to replace a component at run time has some appeal. I just have to remember to release all resources inside finally blocks, not too big a price to pay. It will never replace C++, but it is a useful tool to add to the tool box. It still feels like writing C++ with one hand tied behind my back, I'm sure this will change as I get more experience with it.
I have not done much with C# yet. C# is Microsoft's attempt to kill Java, but they are doing it right: while Java is proprietary, C# is an international standard (Ecma-334). I have not used either one enough to say what makes either better than the other, but I have observed that the CLR seems to load a lot faster than the JVM, though you can't deploy portable web applets with it.
J is a modern version of APL. I used APL a little in the early 1980's, and used J a little in the late 1990's and started using it again a few weeks ago.
Like APL, J is a cryptic write-only language, but you do not need a special keyboard like you do for APL. There is a good reason that it is the language of choice for the highest scoring users at Project Euler: once you get past the steep learning curve, it is very easy to do serious number crunching in J. J is great for crunching numbers but not as good at accessing external data. While you can do data analysis in J, reading the data files is no fun (the command to read from a file is 1!:1, I told you it was cryptic).
I have not done much with R, just worked through a tutorial and done some simple statistical analysis, but I like what I have seen. R is a language for data analysis. It is a powerful language with a plethora of libraries (see CRAN). The learning curve for R is much less steep than it is for J. The J wiki even has a page explaining how to call J from R to get the best of both worlds.
I've been using Python since the mid 1990's. Python is object oriented and dynamically typed. Many people get hung up on the fact that white space is significant. All I can say is 'get over it': you indent your code consistently anyway (at least I hope you do), so what difference does it make if the language requires it? Python code is easy to read and write. It borrows a lot of good ideas from Lisp & Icon. There are a lot of good libraries available. It scales well: you can throw out a simple script quickly, or write a complex application. It is compiled to byte code, and it can also run on either the JVM (Jython) or the CLR (IronPython), either interpreted or compiled. Python is my scripting language of choice at this time.
Oz supports declarative programming, object-oriented programming, constraint programming, and concurrency as part of a coherent whole. The syntax is different from what I'm used to, but not hard to adjust to. I find the single-assignment dataflow variables very interesting. In Oz the bind operator (used to bind a variable to a value or another variable) is symmetric, so X = 7 is the same as 7 = X; both bind the variable X to the value 7.
I am working my way through Van Roy and Haridi's Concepts, Techniques, and Models of Computer Programming, I recommend it highly. I suspect that in time Oz may rival both C++ and Python as my language of choice.
Clips is not billed as a language, but as an expert system shell. Quoting from the Clips User's Guide:
A program written in CLIPS may consist of rules, facts, and objects. The inference engine decides which rules should be executed and when. A rule-based expert system written in CLIPS is a data-driven program where the facts, and objects if desired, are the data that stimulate execution via the inference engine. So a Clips program is quite a bit different from a program in most other languages. Clips can be used standalone, but more often it is embedded in a C program.
ML: Objective Caml, SML, F#, Alice
ML is a polymorphically typed, strict, impure functional programming language that I have been playing with for a few years. It is strongly, statically typed, but the types are inferred, not declared: the programmer does not tell the system what the types are, the compiler figures it out for itself. The syntax takes some getting used to, but I like the pattern matching syntax. ML seems to be both safe and powerful, not an easy feat. Most of my experience, and most of the literature, is in imperative programming; and functional programming is significantly different. I suspect that functional programming might be easier for a beginner to learn than imperative programming (frequently writing a functional program is as simple as restating the specification in the syntax of the language), but it is not what I have studied and am used to. I am just starting to learn the Camlp4 pre-processor; it seems to be a great tool.
Haskell is a polymorphically typed, lazy, purely functional language. Since it is lazy, it does not evaluate function parameters until they are needed. It is pure because it is stateless. I have not done much with it yet, but I am going to, because there are several books on functional programming that I want to read that use it.
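Laziness can be imitated in Python with generators, which is the easiest way I know to show what "not evaluated until needed" buys you: a conceptually infinite sequence is fine as long as a consumer only demands finitely many elements. (In Haskell this is the default everywhere; in Python you must opt in with generators.)

```python
import itertools

def naturals():
    # Conceptually infinite, but values are only produced on demand.
    n = 0
    while True:
        yield n
        n += 1

# Still nothing has been evaluated; this just composes another lazy sequence.
evens = (n for n in naturals() if n % 2 == 0)

# Only now are elements actually computed, and only the five we ask for.
print(list(itertools.islice(evens, 5)))  # [0, 2, 4, 6, 8]
```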
Assembler is not really one language, but a family of languages, one for each CPU. 6800 Assembler was the first language I learned back in the mid-1970's. Later I learned 6502, Z80, 8086 & 80386 Assembler. Everyone should be required to learn assembler before they are allowed to call themselves a programmer, because no matter what language you write in, it will need to be translated into assembler (well, actually machine code, but no need to get technical) before it can run. If you need to program directly to the hardware you need to do it in either C, Forth, or Assembler, and only assembler can give you total control. Only assembler can maximize performance, but this is often overrated: the code is only as good as the programmer, and a good optimizing compiler is frequently better than the average assembler programmer.
D is intended to be the successor to C++. Like all the other languages that try to be the next C++, it is garbage collected. I have already expressed my doubts about garbage collection. But D does offer some real improvements, including true modules, closures, lazy evaluation, and design by contract. D does not support multiple inheritance, but I think 90% of the cases where I might use multiple inheritance can be covered by interfaces or mix-ins much more safely. I think I could learn to like D, but I have not used it enough to say for sure yet.
Scala is a very effective blend of functional, imperative and object oriented techniques, and has a practically minded design despite supporting very advanced techniques. Scala's type system is based on vObj, a calculus and dependent type system for objects and classes which can have types as members. It runs on the JVM or .NET; I have only used it on the JVM. It definitely offers a higher level of abstraction than Java, but to deploy an applet you also need to deploy the Scala run time system, which is not small; I'm not sure how big a problem that is.
Boo is an object oriented, statically typed programming language for the Common Language Infrastructure with a Python-inspired syntax and a special focus on language and compiler extensibility. #develop supports Boo out of the box, including a Boo interpreter window. Boo is statically typed, but it also supports 'duck typing', which basically allows a type to be tested at run time instead of compile time. This seems to give the best of both worlds. There is also a good macro system. I'm still learning Boo, but so far I like it. I hope Boo becomes the language of choice for .NET development.
I have been required to use Perl for a couple of jobs, once for CGI in the late 1990's and once for scripting a build system in the early 2000's. I do not like Perl. It is almost as ugly as J and does not give me anything I can't get in any number of other languages. There was a time when it might have been worth using Perl to get its regular expressions, but now just about any language offers Perl compatible regular expressions. I do have to admit that CPAN is impressive.
XSL Transformations (XSLT) is used for transforming XML documents. It is an XML application, and it is a Turing complete language. I have only used it for a few simple things and have not started to plumb its depths. To get a glimpse of the depths see FXSL - the Functional Programming Library for XSLT.
I used Icon a bit in the mid 1980's. I liked it, but for some reason I drifted away from it; now most of the features I liked about it can be found in Python. Unicon bills itself as the unified extended dialect of Icon. It looks like it may be worth another look.
LISP: Scheme, Common Lisp
I first encountered Lisp in the mid 1980's when I read Lisp by Winston and Horn. I used XLISP in 1993 when I read Genetic Programming: On the Programming of Computers by Means of Natural Selection by John Koza. Last year I read Hal Abelson's, Jerry Sussman's and Julie Sussman's Structure and Interpretation of Computer Programs which is based on Scheme, if you have not read this book you should. I plan to read Practical Common Lisp by Peter Seibel this year.
Lisp is the second oldest language listed here, older than COBOL, but not as old as Fortran, and many people claim it is still the best language around. Its greatest strength is also its greatest weakness: its syntax. Generally I do not think syntax is nearly as important as semantics, but I'm tempted to make an exception for Lisp. The syntax of Lisp is so simple that it is easy to do metaprogramming for Lisp, in Lisp. Therefore you can use the full power of Lisp to write macros for Lisp. It is hard to comprehend how powerful this is unless you have done it. Also, since the syntax is the same for data as for code, it is easy to write a program that writes code on the fly. On the flip side, the syntax can be infuriating. Many people claim Lisp is an acronym for 'Lost In Stupid Parentheses'. Lisp has a reputation for being inefficient, but with modern implementations this is no longer true. Lisp is also the macro language for Emacs, the GIMP, and AutoCAD.
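The "code is data" idea is easier to show than to describe. Here is a miniature of it in Python (the evaluator and the operator table are my own toy, not anything from a real Lisp): an expression is just a nested list, so a program can build and transform other programs with ordinary list manipulation, which is exactly what a Lisp macro does.

```python
import operator

# Python lists standing in for S-expressions: ["+", 1, ["*", 2, 3]]
# plays the role of (+ 1 (* 2 3)).
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def ev(expr):
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*[ev(a) for a in args])
    return expr  # a number evaluates to itself

program = ["+", 1, ["*", 2, 3]]  # data that is also code
print(ev(program))  # 7

# Because programs are data, a "macro" is just a function on lists:
double = lambda e: ["*", 2, e]
print(ev(double(program)))  # 14
```

In real Lisp the evaluator is the language itself, so transformations like `double` run at compile time with the full language available, which is the power the paragraph above is pointing at.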
Designed by Bertrand Meyer and featured in his excellent book Object-Oriented Software Construction, which I recommend to anyone working in any OO language, not just Eiffel. Eiffel is best known for featuring Design By Contract in the core of the language. It has been standardized as ECMA-367 & ISO/IEC DIS 25436.
I worked through the Lovelace tutorial in the mid 1990's. Ada has a bad reputation as a language designed by a committee, but the fact is that all language standards are written by committees. I have not used Ada much, but it seems to be pretty good. If I were in charge of a very large scale project, I'd seriously consider using Ada.
Prolog (PROgramming in LOGic) is a logic programming language. A Prolog program is not composed of instructions to be executed, but of a set of constraints to be solved. My first exposure to Prolog was in the mid 1980's, when I used it in an emergency room triage system. I used it again in the early 1990's to validate data models. I have not used it since, but am thinking of revisiting it, because the LC++ library uses a language that is semantically very similar to Prolog, and XPCE, the GUI tool kit that is part of SWI Prolog, looks very interesting.
μC++ extends C++ with advanced control-flow including light-weight concurrency. It supports static multi-level exits from nested loops, enhanced exception handling, coroutines, and concurrent tasks. Unfortunately it does not support MS-Windows at this time. I should look into how difficult it would be to port it. μC++ is featured in Understanding Control Flow with Concurrent Programming using μC++ by Peter A. Buhr, which is available on the μC++ web site. I strongly recommend all programmers read this paper, even if you have no interest in learning a new language, just to improve your understanding of control flow.
To the best of my knowledge Fortran is the oldest language still in use today. I first used it in 1979, it has changed a lot since then, but nowhere near as much as BASIC. While many C programmers may have a hard time accepting this, under some circumstances Fortran can be more efficient than C. The reason for this is that due to Fortran's more restrictive semantics the compiler may be better able to take advantage of pipelining and concurrency.
I used PL/I in the early 1980's. I've forgotten most of it, but as I recall it was like Pascal on steroids: anything you wanted to do, there was a built in function to do it. I think the compiler was too big to be ported to a microcomputer. Of course now microcomputers are more powerful than the mainframe was then, but PL/1 was dead before that happened.
It is hard to say anything meaningful about Basic. I learned BASIC in the late 1970's, at that time BASIC was an unstructured language, all flow control was done with goto's. When I used it again in the late 1990's it was a structured language and I did not use a single goto. The BASIC I used in the 1990's was like an untyped Pascal (I know the phrase 'untyped Pascal' is heresy, but I can't think of a better way of describing it.) Recently I have only used BASIC as the macro language for Microsoft Office tools (Word, Excel and Access).
I first used Pascal in the early 1980's, and last used Delphi in the late 1990's. Pascal is a nice, underrated language. It might have become more popular than it is if Wirth had not tried to cripple it. Like many students in the 1980's, I wrote a Pascal compiler as a class project.
Wirth did not want Pascal to be extended to make it a useful language to do real work in. He intended it to remain a toy language for teaching, and Modula 2 was to be the language to do real work in. When I learned Modula 2, in the mid 1980's, I was used to Pascal and hated the fact that Modula 2 was case sensitive. Now I dislike languages that are not case sensitive.
AOP is a great idea, but these implementations of it are seriously flawed. They all suffer from the same non-locality problem as Intercal's come from statement. For more information read 'AOP Considered Harmful' by Constantinos Constantinides, Therapon Skotiniotis & Maximilian Stoerzer. I highly recommend you read Clarke & Baniassad's Aspect-Oriented Analysis and Design: The Theme Approach and use its ideas in your analysis and design, but do the implementation in an OOP language until a better AOP is developed. To see some ways to do this, refer to Christopher Diggins' Aspect-Oriented Programming & C++ and the tutorial entitled Aspect-Oriented Programming with C++ and AspectC++, which can be found at the AspectC++ web site.
Cmm and Frost are two different extensions to C++ that add multiple dispatch. C++ supports single dispatch, meaning that the method called can be determined at run time based on the type of the object the method is invoked on. Multiple dispatch extends this to also allow selection based on the types of the parameters. This eliminates the need for the Visitor pattern. Cmm has a few additional features, the most useful being embedded functions (allowing whole function definitions inside expressions) and reflection. It also allows block structure from indentation, like Python, and an alternative declaration syntax.
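To make the single vs. multiple dispatch distinction concrete, here is a toy multiple-dispatch registry in Python. This is my own sketch, not how Cmm or Frost actually work under the hood, but it shows the essential idea: the function to run is looked up from the runtime types of all the arguments, not just the receiver.

```python
# A toy multiple-dispatch mechanism: a table keyed by the runtime types
# of *both* arguments. The classes and collision rules are invented for
# illustration (the classic asteroids-and-ships example).

_registry = {}

def multi(cls_a, cls_b):
    """Register fn to handle the argument-type pair (cls_a, cls_b)."""
    def register(fn):
        _registry[(cls_a, cls_b)] = fn
        return fn
    return register

def collide(a, b):
    fn = _registry[(type(a), type(b))]  # dispatch on both types at run time
    return fn(a, b)

class Asteroid: pass
class Ship: pass

@multi(Asteroid, Ship)
def _(a, b): return "ship destroyed"

@multi(Asteroid, Asteroid)
def _(a, b): return "asteroids bounce"

print(collide(Asteroid(), Ship()))  # ship destroyed
```

With only single dispatch you would need the Visitor pattern to achieve the same effect: the first object's virtual method calls back into a virtual method on the second, using two single dispatches to simulate one double dispatch.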
MatLab was developed as a substitute for Fortran that would be easier for students to use. I used it on several projects in the mid to late 1990's and early 2000's. Scilab and Octave are open source clones of Matlab, each slightly different.
Euler is similar to Matlab, but not a clone. One nice feature is that you can embed yacas code in it.
Yacas (Yet Another Computer Algebra System) is, as the name suggests, a computer algebra system. It is used to do symbolic math.
Maple V is a language for doing symbolic math. I first used Maple V in the early 1990's, but have not used it extensively.
I used Mathematica briefly in the early to mid 1990's. It is a powerful computer algebra system. At that time it was a real memory hog and brought my PC to its knees.
Another mathematical language which distinguishes itself by its support of dimensionality and units.
Forth is a very small language; it has even been written into the microcode of some CPU's. I first heard of Forth when I heard that the controllers used to control the cameras and models that created the special effects in the first Star Wars movie were programmed in Forth. The best thing about Forth is Leo Brodie's Starting Forth, which is arguably the best programming language book ever written! I learned Forth in the early 1980's. I even wrote a simple version of Forth in Z80 assembler.
COBOL (COmmon Business-Oriented Language) is the third oldest language on this page (after FORTRAN and LISP), and there are probably more COBOL programs running at this moment than FORTRAN and LISP combined. What gives COBOL its staying power? I don't know, but I suspect decimal arithmetic is part of the answer. I think COBOL is the only language listed here that does decimal arithmetic out of the box (there are some good C++ class libraries for decimal arithmetic, but they are not standard).
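Why decimal arithmetic matters is easy to demonstrate. Binary floating point cannot represent 0.1 exactly, which is unacceptable when the numbers are money. Python's standard decimal module shows the difference, though as a library it is still not "out of the box" the way COBOL's PICTURE clauses are:

```python
from decimal import Decimal

# Binary floating point: 0.1 and 0.2 are only approximated, and the
# rounding errors show through.
print(0.1 + 0.2 == 0.3)  # False

# Decimal arithmetic: the values are represented exactly.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```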
I read somewhere that there are more lines of Rexx code in existence than there are of COBOL; that is quite a claim. I used it as the primary scripting language for OS/2 and as the macro language for the Kedit editor. I have not used Rexx since I gave up both OS/2 and Kedit in the late 1990's.
I used dBase in the mid 1980's and Clipper in the early 1990's. It is a good environment for small to mid sized database applications. I thought that it was dead, but I see that dBase is still available and that there are three open source Clipper compilers available. I'll keep them in mind next time I need to create a small database application.
SQL fails to properly capture the relational data model as expressed in relational calculus, but it is ubiquitous. If you are going to develop databases you need to know SQL (you should also know relational calculus). I've been using SQL since 1990.
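For anyone who has not seen it, the flavor of SQL is declarative: you describe the rows you want, not the loop that finds them. A small taste, run through Python's built-in sqlite3 module against an in-memory database (the table and data are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE langs (name TEXT, year INTEGER)")
con.executemany("INSERT INTO langs VALUES (?, ?)",
                [("Fortran", 1957), ("Lisp", 1958), ("COBOL", 1959)])

# Say what you want -- names of languages from before 1959, oldest first --
# and let the engine decide how to find them.
rows = con.execute(
    "SELECT name FROM langs WHERE year < 1959 ORDER BY year").fetchall()
print([name for (name,) in rows])  # ['Fortran', 'Lisp']
```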
Suneido has some good ideas. It looks like a well designed language that captures the relational model better than SQL. But I have a big problem with it. The program is stored in the database file! This causes several problems. First, you need to write the program in the IDE, you are not free to use the editor of your choice. This is inconvenient, but I can live with it. The more important problems are that since all the code lives in the same file it is difficult for a team to work on it and version control is a nightmare. The application, test data, and unit tests all are in the same file. I'll let you imagine the trouble that can cause. Deploying an update to an existing system with live data in the database is not easy since you need to replace part of the file, but not other parts. All in all while there are many good things to be said for this language, it is far more trouble than it is worth.
I worked with PC Focus in the late 1980's. It was a hierarchical database language. I do not know whether it still exists.
Many programmers are not even aware that PostScript is a full blown programming language, it is a dialect of FORTH. If you have never programmed in PostScript you may want to try it.
Tcl (Tool Command Language) syntax resembles Lisp, but its semantics are nowhere near as expressive as Lisp's. I think the only reasons anyone ever used Tcl were for Tk and Expect. You can now use Tk from almost any language, but while there have been two attempts to create a version of Expect for Python, neither one ever worked properly to my knowledge, so you still need to learn Tcl if you want to use Expect.
ISETL (Interactive SET Language) is an interpreted mathematical programming language closely resembling the language of sets and functions used by mathematicians. I used ISETL when I was working my way through Learning Discrete Mathematics with ISETL by Baxter, Dubinsky & Levin some time in the 1990's.
SNOBOL (StriNg Oriented symBOlic Language) is a language for text processing and pattern matching. I used SNOBOL briefly in the early 1980's. It had some nice features, but its lack of high level control structures turned me off and I quickly moved to Icon.
No list of programming languages would be complete without Intercal. INTERCAL is purposely different from all other computer languages. Based on the well-known and oft-demonstrated fact that a person whose work is incomprehensible is held in high esteem, intercal is designed to make any program incomprehensible. (I sometimes think Perl and J are based on the same philosophy). Someday I'm going to write a useful program in Intercal, just to say I did. I'm sure developing something complex in intercal would change the way you think.
I have also used MS-DOS batch language, Unix Shell scripts, lex, yacc, and the macro languages for many applications including MS-Word 2, Word Perfect, and the Brief & VIM editors.
There are several Languages I have not used yet, but interest me:
- Ruby looks like Perl done right. I first looked at Ruby many years ago, but at that time all the documentation was in Japanese. Now there is plenty of English documentation, so it is time to give it a second look.
- Q is a functional programming language based on term rewriting.
- Erlang is a functional language that supports fine grained concurrency, transparent distribution, fault detection, persistence, and hot code replacement. The YAWS web server and ErlyWeb web development framework look very promising for developing web services.
- Nemerle is a high-level statically-typed programming language for the .NET platform. It offers functional, object-oriented and imperative features. It has a simple C#-like syntax and a powerful meta-programming system.
- Lua is a powerful light-weight programming language. It combines simple procedural syntax with powerful data description and extensible semantics.
- Io is a small, prototype-based programming language.
- Joy is a functional programming language which is not based on the application of functions to arguments but on the composition of functions.
- Onyx is a powerful stack-based, multi-threaded, interpreted, general purpose programming language similar to PostScript.
- Mercury is a logic/functional programming language, which combines the clarity and expressiveness of declarative programming with advanced static analysis and error detection features. Its highly optimized execution algorithm delivers efficiency close to conventional programming systems. Mercury addresses the problems of large-scale program development, allowing modularity, separate compilation, and numerous optimization/time trade-offs.
- FLORA-2 is an advanced object-oriented knowledge base language and application development environment. The language of FLORA-2 is a dialect of F-logic with numerous extensions, including meta-programming in the style of HiLog and logical updates in the style of Transaction Logic. FLORA-2 was designed with extensibility and flexibility in mind, and it provides strong support for modular software design through its unique feature of dynamic modules.
- Dylan (DYnamic LANguage) is a multi-paradigm language that includes support for functional and object-oriented programming, and is dynamic and reflective while providing a programming model designed to support efficient machine code generation.
- Factor is a dynamically typed, stack-based programming language.
- Reading the creator's blog, it looks like he and I have similar views of what we want in a programming language. It looks like Heron may be my idea of a replacement for C++.
- Felix is an advanced Algol like procedural programming language with a strong functional subsystem. It features ML style static typing, first class functions, pattern matching, garbage collection, polymorphism, and has built in support for high performance microthreading, regular expressions and context free parsing.
- The Cat language is a pure functional language, inspired by Joy. All constructs in Cat behave as functions which take a single stack as input and return a new stack. Cat has no variable declarations, and no argument declarations either.
- Sather is an object oriented language designed to be simple, efficient, safe, flexible and non-proprietary. One way of placing it in the "space of languages" is to say that it aims to be as efficient as C, C++, or Fortran, as elegant as and safer than Eiffel, and support higher-order functions and iteration abstraction as well as Common Lisp, CLU or Scheme.
- Squeak is a popular dialect of Smalltalk. The IDE looks a bit strange. I find Croquet particularly interesting.
- Strongtalk is a strongly, statically typed dialect of Smalltalk.
- Stratego is a modular language for the specification of fully automatic program transformation systems based on the paradigm of rewriting strategies.
- Nice extends the ideas behind object-orientation in order to better support modular programming and static type safety. It also incorporates features from functional programming. It is one of the few languages that support multi-methods.
- Tela (TEnsor LAnguage) is a scientific computing language and environment. It is mainly targeted for prototyping large-scale numerical simulations and doing pre- and post-processing for them.
- Yorick is an interpreted programming language, designed for postprocessing or steering large scientific simulation codes. The language features a compact syntax for many common array operations, so it processes large arrays of numbers very efficiently. Superficially, yorick code resembles C code, but yorick variables are never explicitly declared and have a dynamic scoping similar to many Lisp dialects.
- PARI/GP is a specialized computer algebra system, primarily aimed at number theorists, and for speed.
- GAP (Groups, Algorithms and Programming) is a system for computational discrete algebra, with particular emphasis on Computational Group Theory.
- Lush is an object-oriented programming language for large-scale numerical and graphic applications.
- Glee, like J, is derived from APL. It looks like it handles files and strings much better than J.
- NIAL, which stands for the Nested Interactive Array Language, is also derived from APL. It claims to combine some of the best features of functional and procedural programming.
- Limbo is the application programming language for Inferno. Like Java, Limbo is compiled to architecture-independent object code which is then interpreted by the Dis virtual machine or JIT compiled. Communication channels are a primitive data type in Limbo.
So the question many people want me to answer here will be 'What is the best language?', but that is a question that has no answer. Every language represents trade-offs; as a result, each language has both strengths and weaknesses. With the possible exception of Intercal, there is probably some task for which any language you choose will be best suited (if you need to write a program to print the full lyrics to the song 99 Bottles of Beer on the Wall, then HQ9+ might be a good choice, even though it is not Turing complete).
The secret of successful program design is to work at the highest level of abstraction possible at all times. For this reason you should always work with the language which offers the best high level abstraction of the problem domain. If the problem is mathematical in nature, use J, R, Yacas or another mathematical language. If the problem involves manipulating relational data tables, use SQL, Suneido, or xBase. If the problem involves graphic layout, use PostScript. If the problem is to directly control computer hardware, you will need to use C, Forth or Assembler. If you are developing an expert system, you should use Clips. If you are solving a logic problem, use Prolog, Mercury, or Oz. In most cases there will not be an existing language which embodies the necessary abstraction, so you will need to develop your own abstraction in the language you choose; so choose a language which will make developing an abstraction easy, such as C++, Oz, Scala, Nice, Heron, Boo, Nemerle, Unicon, etc.
I recommend that all programmers learn C, C++, Assembler, and some dialect of Lisp. I do not recommend C or C++ as a first language, and a strong case can be made against Assembler or Lisp as a first language, although Assembler was my first language, and many people recommend Scheme as a first language. Pascal is a good first language; it was designed for teaching. I have heard good things about Euphoria as a first language, but have no experience with it. A dialect of ML might be a good first language, but I'm not sure.