Right, but that's all you have to learn re: grammar and syntax in LISP/Scheme.
Yeah, but it looks intimidating even to me, with my experience coding and having already tried Racket.
I'm not anti-LISP or anything; Tom says I'm a born LISPer. (I don't agree, but that's nice). Parentheses aside, LISP inspired Logo and is about 6 years older than BASIC. (LISP dates to 1958, BASIC to 1964; Logo followed in 1967). It's a very important language!
If I had a use for LISP that justified the effort, the parentheses might not be a dealbreaker. One of the great taboos in Python-land is braces instead of whitespace, so naturally several people have written tools that convert code written with braces into Python's whitespace. You can write one in less than a day; I wrote one for fun and then found the other versions.
So if it were just about parentheses, that's fixable. More about this in a moment.
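To give the idea, here's a minimal sketch of that kind of converter, assuming a simplified dialect where "{" ends the line that opens a block and "}" sits alone on a line to close one (the real versions have to cope with strings and comments too):

# Toy sketch: convert brace-delimited blocks to Python-style indentation.
def braces_to_indent(source, indent="    "):
    out, depth = [], 0
    for raw in source.splitlines():
        line = raw.strip()
        if line == "}":
            depth -= 1                        # close a block: dedent
            continue
        opens = line.endswith("{")
        if opens:
            line = line[:-1].rstrip() + ":"   # "if x > 0 {" -> "if x > 0:"
        out.append(indent * depth + line)
        if opens:
            depth += 1
    return "\n".join(out)

print(braces_to_indent("if x > 0 {\n    print(x)\n}"))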
Compare that with any other language (including the language I use most, C). Most other languages have a lot of syntactic and grammatical cruft that isn't needed.
Full Agreement! At least on the statement by itself. So many languages do things I think are noisy and distracting. Too much syntax. Python prided itself on staying away from this (within reason) and then got increasingly complex when Google started throwing money at it.
I understand they wanted features, but they started deprecating some of the simpler, easier things. It would be nicer if they could just add stuff and leave the good stuff alone.
Differences of opinion are bound to happen here-- ours or someone else's; fortunately I have not met too many Python purists in BSD land. Python made a tool I love to use; its other agendas do nothing for me. PyPy fixes this for me, though BSD uses Python 2 to build PyPy (there's easy-to-understand logic involved, but it's a bit of a Catch-22 for PyPy on BSD-- I hope it gets resolved; I've talked to the maintainers in FreeBSD and OpenBSD already).
Then they have this (artificial) difference between statements and expressions. And this even more fundamental split between code and data (which, at the hardware level, are the same thing).
I've used the shell and I've used Tcl. In both (I think M4 is another one) it's pretty neat how code can produce data that is then run as code. I frequently use $() command substitution.
However, there are so many times (longer code I want to run more than a couple of times) where I really want to turn that off-- actually, I want it off by default. Because sometimes data is data, and you DO NOT want it treated as code. Escaping everything is irritating, and single quotes are far from a perfect solution.
Python is far better behaved in this regard. Which is to say, boring by comparison. I like languages on both sides of this argument, for different things. But for beginners I like strings to be as simple as possible: data has clear demarcation.
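Python's subprocess module shows the contrast nicely-- pass a list and the data stays data; pass a string to a shell and it can turn into code. (A generic illustration, not a dig at any particular script:)

import subprocess

filename = "notes; rm -rf ~"        # hostile "data"

# shell=True hands the whole string to a shell, so the data becomes code:
# subprocess.run("cat " + filename, shell=True)   # would also run rm -rf ~ (don't!)

# An argument list keeps data as data: the semicolon is just a character
# in a (strange) filename, never interpreted by a shell.
subprocess.run(["cat", filename])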
I think most languages go too far in terms of how fancy they get with syntax, though. The number of rules should be kept small-- and Perl certainly doesn't keep them small.
Originally I only allowed "strings" and #hashes but it was harder to read. I added ":" as an optional line separator, then I got silly and added "=", "(", ")", ";" and "|". This way, you can add punctuation that is similar (occasionally identical) to other languages. For example, most lines work like a pipeline (simpler than Bash, or at least more consistent), and you can put "|" between "lines" to help explain Bash (or sh or ksh-- UNIX, since late 1973) pipelines:
echo early 1973 pipes > fold -sw 10 > tr a-z A-Z >
echo late 1973 pipes | fold -sw 10 | tr a-z A-Z
You can thank Ken Thompson for the latter (he also implemented the former, but it wasn't his idea to add them).
Again, LISP/Scheme makes composition simple (lambdas) and it is extremely orthogonal.
It is. But while LISPers love the consistency of LISP, and I love languages with few rules, for me LISP is so consistent that I feel like I'm always losing my place in the code. There aren't enough "landmarks" or patterns. This is not a problem for LISPers.
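The composition point is easy enough to show even in Python, which borrowed lambda from LISP:

compose = lambda f, g: lambda x: f(g(x))   # feed g's output to f

shout = compose(str.upper, str.strip)
print(shout("  hello  "))                  # HELLO

Two functions glued into one, with no extra machinery.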
My thoughts on this are that this sort of flexibility in syntax is irrelevant to learning programming.
It's not, if the result is overwhelming and difficult. BASIC and Python have lots of parentheses, but you can avoid them more easily (more often) than in Lisp, and this makes the language "less like equations" to people who think code is just a bunch of equations.
My favourite thing about coding is that it can do the math while being easier than math notation. Throw a giant equation on the board and I'll hide-- cowardly but true. Lots of people are like that. Nest a bunch of function calls and tell people "this is how you write code" and they'll often hide too.
But pipelines aren't so bad. You can chain stuff together in a left-to-right sequence, which is much closer (in the way people think) to human language, although human language does alternate confusingly between infix, prefix and postfix for things like adjectives and verbs. I stick with prefix notation for its broad appeal and familiarity and throw away everything I'm happy to avoid.
The result isn't as rigidly consistent as Lisp, but there's a balance, and it leans towards simple/rigid consistency.
and more importantly, very naturally learn and use things like lists and trees which are considered "advanced" in other languages. Python is OK, too, in this regard.
shopping "" arr times 20 ; arrset 1 "milk"
shopping "milk" arr plus "eggs"
shopping "milk eggs" split shopping " "
That's three ways to make an array. I wouldn't make three ways to create an array just to have three ways; I prefer to keep it simple. But the first way converts a string (or a number) to a list, then adds 19 copies to make a 20-item list. Then it sets the first item to "milk". I know you won't approve of arrays being 1-based; most people won't.
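For comparison, the same three constructions work out to roughly this in Python (0-based, so "item 1" becomes index 0):

shopping = [""] * 20                  # a value becomes a 20-item list
shopping[0] = "milk"                  # arrset 1 "milk" -- "item 1" is index 0 here

shopping = ["milk"] + ["eggs"]        # arr plus: make a list, then add an item

shopping = "milk eggs".split(" ")     # split on spaces -> ["milk", "eggs"]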
Here's a fun bit of trivia-- every time there's a new cardinal decade, pedants scoff because they think the decade doesn't start until the following year. They're both right-- the people who started celebrating the new millennium in 2000 are going by cardinal decades and cardinal centuries, etc. while the people scoffing are going by ordinal decades and so on.
It makes sense from a CS standpoint for indexes to be 0-based, and most languages do it that way, but it's not "natural" to most people, who count from 1. For a beginner creating a substring or a slice of an array, however, "1" corresponds to "1st" and "2" corresponds to "2nd", and so on. If I'm thinking of something vertical like a long list, I don't care whether indexes are 0-based or 1-based. But if I'm thinking of something that goes left to right (like English) then I'm still more comfortable thinking of that as 1-based.
Python is more consistent, but all the beginner does is forget to subtract a lot. It's definitely good practice, but it can wait. I've known the difference for 30 years, I've coded in Python for more than a decade, and I STILL find 1-indexed substrings more natural and intuitive. But I figure that's going to be a dealbreaker for a lot of people eager to teach 0-indexing, so if someone working on this with me said "OK, but knock it off with the 1-indexing if you want our support" I'd drop it in a second for the "official" version. For graphics, screens and sprites tend to be (0, 0)-based almost universally, even in beginner-friendly languages.
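For what it's worth, here are the two styles side by side in Python-- the 1-based, inclusive helper is hypothetical (I've called it mid, after BASIC's MID$), just to show the difference:

text = "programming"

# 0-based, half-open slice: the convention Python actually uses.
print(text[0:3])          # pro

# Hypothetical 1-based, inclusive substring: "characters 1 through 3",
# the way beginners tend to phrase it.
def mid(s, first, last):
    return s[first - 1:last]

print(mid(text, 1, 3))    # pro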
I've never designed a programming language, and I wouldn't want to--I look at languages like Racket or Python and then say, "Eh. These are far, far better than my own tinkerings."
That's why inline Python is such a fun cheat. The goal is to make a language where you don't need it, but it's extensible. Several BASIC dialects have had features that let you extend the capability using snippets of machine language or other routines.
I made certain my language does a lot, but part of the design is to have about 100 (or fewer) functions, and by the time that feels like not enough-- honestly, explore and find yourself a second language. Or, use inline Python. Or learn enough Python (just snippets will do) to actually Add Features to the language itself.
It's pretty easy to do. I never added a native feature until I had gotten tired of using inline Python to do the same thing repeatedly, so this was a great way of prioritising which features to add once I had the most necessary ones. A first-time coder really doesn't need more than 100 commands. If that's too small, you're ready for Python or Javascript or sh or something! You've "graduated" from first-time coding and are definitely ready for a "real" language. But more importantly, by the time you've learned it you already understand the basics of coding. And the differences between this language and Python aren't great differences. They translate very well.
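The mechanism can be as simple as a few lines. This is a minimal sketch of the general idea, not my exact code: lines marked off as inline Python get handed to exec() along with the interpreter's variables, so the two sides can see each other's names.

# Minimal sketch of an "inline Python" escape hatch for a
# line-oriented interpreter.
def run_inline_python(block, env):
    exec(block, {"__builtins__": __builtins__}, env)

env = {"name": "world"}                       # the toy language's variables
run_inline_python("greeting = 'hello ' + name", env)
print(env["greeting"])                        # hello world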
Market pressures. Gotta train the next generation of web programmers, right?
I figure, but it's a shame for them to focus on that because Python is good at a lot of things besides web development.
but, I also think that at each stage in the language design process, you had better produce a grammar for it and then pass it through an actual parser (or, better, use a parser generator like yacc/bison). Otherwise, you could end up with a language which is very difficult to parse or has all sorts of hidden ambiguities.
Parser generators are vastly overrated. They have all the advantages you think they do, and from a computer science perspective those are real advantages. There are certain things (like concurrency or type safety) that are likely to improve if you do things that way.
However, this is overstated as a requirement. Handwritten parsers are common enough, and even at Brown, Krishnamurthi pokes at the assumption that to create a language you need a "compiler compiler". People use them, they have advantages, and so people start insisting that they're required-- cargo cult logic. But it's important to say that they largely accomplish what you say they do, and that those are legitimate reasons to use one. Personally, I simply don't want to use BNF or anything like it. Are there costs? Yes. Do they matter to the design and purpose of what I'm doing? Not really! Does an intro to computer language design teach you how to handcode a parser? Yes. Should you always do that? Probably not, depending.
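And "handcoding a parser" doesn't have to mean anything scary. A recursive-descent parser for a small grammar fits on a napkin-- here's a sketch for expr -> term (("+"|"-") term)*, where term is just a number:

import re

def parse_expr(src):
    tokens = re.findall(r"\d+|[+\-]", src)   # trivial tokeniser
    pos = 0

    def term():
        nonlocal pos
        value = int(tokens[pos]); pos += 1
        return value

    value = term()
    while pos < len(tokens):
        op = tokens[pos]; pos += 1
        if op == "+":
            value += term()
        else:
            value -= term()
    return value

print(parse_expr("12 + 7 - 4"))   # 15

No BNF, no generator, and it's obvious where you'd hang friendly error messages.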
I intend to try your code, and I'm very confident it won't do anything naughty, but I'll feel better running it in a restricted environment. It almost certainly won't do anything naughty, but if it did, I couldn't tell you. I spotted the rot13 comment though; that's a pretty big hint. I actually wrote one of those (the thing the comment suggests the program does) in my own language fairly early on. I designed it strictly based on the LED one I picked up at a thrift shop for a few dollars.
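For anyone who hasn't met it: rot13 just rotates each letter 13 places, and Python even ships a codec for it-- which is why spotting it in a comment is such a strong hint about what a program is up to.

import codecs

print(codecs.encode("Hello, world", "rot13"))   # Uryyb, jbeyq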