Computer Language Primer - Part 1 - Comments

In summary, this article provides a good overview of computer language basics. Every 'what language should I use' discussion thread should include a reference back to this article. The article includes a history of computer languages, as well as a discussion of self-modifying code and event-driven programming.
  • #1
phinds
phinds submitted a new PF Insights post

Computer Language Primer - Part 1

Continue reading the Original PF Insights Post.
 
  • Like
Likes rootone, blue_leaf77, stoomart and 5 others
  • #2
Very well done. Every 'what language should I use' discussion thread should have a reference back to this article. Too many statements in those threads are off target because posters have no clue about the origins.

Typo in the Markup language section: "Markup language are"

Thanks for a good article.
 
  • #3
@phinds , thanks for the trip down memory lane. That was fun to read.

I too started in the machine language era. My first big project was a training simulator (think of a flight simulator) done in binary on a computer that had no keyboard, no printer. All coding and debugging was done in binary with those lights and switches. I had to learn to read floating point numbers in binary.

Then there was the joy of the most powerful debugging technique of all. Namely, the hex dump (or octal dump) of the entire memory printed on paper. That captured the full state of the code & data and there was no bug that could not be found if you just spent enough tedium to find it.

But even that paled compared to the generation just before my time. They had to work on "stripped program" machines. Instructions were read from the drum one at a time and executed. To do a branch might mean (worst case) waiting for one full revolution of the drum before the next instruction could be executed. To program them, you not only had to decide what the next instruction would be, but also where on the drum it was stored. Those choices made an enormous difference in speed of execution. My boss did a complete boiler control system on such a computer that had only 3 x 24-bit registers and zero RAM. And he had stories about the generation before him that programmed the IBM 650 using plugboards.

In boating we say, "No matter how big your boat, someone else has a bigger one." In this field we can say, "No matter how crusty your curmudgeon credentials, there's always an older crustier guy somewhere."
 
  • Like
Likes stoomart and phinds
  • #4
Thanks Jim. Several people gave me some feedback and @Mark44 went through it line by line and found lots of my typos and poor grammar. The one you found is one I snuck in after he looked at it :smile:
 
  • #5
anorlunda said:
@phinds , thanks for the trip down memory lane.
Oh, I didn't even get started on the early days. No mention of punched card decks, teletypes, paper tape machines and huge clean rooms with white-coated machine operators to say nothing of ROM burners for writing your own BIOS in the early PC days, and on and on. I could have done a LONG trip down memory lane without really telling anyone much of any practical use for today's world, but I resisted the urge :smile:

My favorite "log cabin story" is this: In one of my early mini-computer jobs, well after I had started working on mainframes, it was a paper tape I/O machine. To do a full cycle, you had to (and I'm probably leaving out a step or two, and ALL of this is using a very slow paper tape machine, taking a full afternoon easily for these steps) load the editor from paper tape, use it to load your source paper tape, use the teletype to do the edit, output a new source tape, load the assembler, use it to load the modified source tape and output an object tape, load the loader, use it to load the object tape, run the program, realize you had made another code mistake, go out and get drunk.
 
Last edited:
  • Like
Likes DrClaude
  • #6
Mainframe assemblers included fairly advanced macro capabilities going back to the 1960s. In the case of IBM mainframes, there were macro functions that operated on IBM database types, such as ISAM (indexed sequential access method), which was part of the reason for the strange mix of assembly and Cobol in the same programs, a mix that, due to legacy issues, still exists somewhat today. Microsoft assemblers and other assemblers for mini / micro computers had/have macros, and MASM 6.11 includes some higher-level language concepts with dot directives like .if .else .endif .while ... .

Self-modifying code - this was utilized on some older computers, like the CDC 3000 series, which included a store instruction that only modified the address field of another instruction, essentially turning an instruction into an instruction + modifiable pointer. IBM 360-type mainframes use an instruction similar in concept, "EX" (execute), to override an otherwise fixed operand field of a target instruction, such as changing the number of bytes to move on a move character instruction (MVC).

Event driven programming is part of most pre-emptive operating systems and applications, and time sharing systems / applications, again dating back to the 1960s.

Another advancement in programming was tool sets that generated code. Prototyper for the Macintosh was an early example. The developer would design a user interface using a drag and drop based tool set, and Prototyper would generate the code, to which the developer would then add code in order to create an application. Visual Studio includes this feature, and in the case of Visual Basic, includes the ability to generate code for charts and graphs.
 
Last edited:
  • #7
Yes, there are a TON of such fairly obscure points that I could have brought in, but the article was too long as is and that level of detail was just out of scope.
 
  • #9
Greg Bernhardt said:
Great part 1 phinds!
AAACCCKKKKK ! That reminds me that now I have to write part 2. Damn !
 
  • Like
Likes Greg Bernhardt
  • #10
phinds said:
AAACCCKKKKK !
I thought it was spelled ACK...

(As opposed to NAK)
 
  • Like
Likes jim mcnamara and DrClaude
  • #11
Very well done, thanks phinds for this insight! Judging from this part 1, it gives a very good general picture touching upon many important things.
 
  • Like
Likes phinds
  • #12
Mark44 said:
I thought it was spelled ACK...

(As opposed to NAK)
No, that's a minor ACK. Mine was a heartfelt, major ACK, generally written as AAACCCKKKKK !
 
  • #13
phinds said:
AAACCCKKKKK ! That reminds me that now I have to write part 2.
No choice.
You have sent an ASCII 06.
Full Duplex mode?
ASCII 07 will get the attention for part 2.
 
  • #14
256bits said:
No choice.
You have sent an ASCII 06.
Full Duplex mode?
ASCII 07 will get the attention for part 2.
Yeah but part 2 is likely to be the alternate interpretation of the acronym for ASCII 08
 
  • #15
Some comments:

No mention of plugboard programming.

http://en.wikipedia.org/wiki/Plugboard

"8080 instruction guide (the CPU on which the first IBM PCs were based)." The first IBM PC's were based on 8088, same instruction set as 8086, but only an 8 bit data bus. The Intel 8080, 8085, and Zilog Z80 were popular in the pre-PC based systems, such as S-100 bus systems, Altair 8000, TRS (Tandy Radio Shack) 80, ..., mostly running CP/M (Altair also had it's own DOS). There was some overlap as the PC's didn't sell well until the XT and later AT.

APL and BASIC are interpretive languages. The section title is "high level language", so perhaps it could mention that high level languages include both compiled and interpretive languages.

RPG and RPG II are high level languages, popular for a while for conversion from plugboard based systems into standard computer systems. Similar to plugboard programming, input to output field operations were described, but there was no determined ordering of those operations. In theory, these operations could be performed in parallel.
 
  • #16
rcgldr said:
"8080 instruction guide (the CPU on which the first IBM PCs were based)." The first IBM PC's were based on 8088, same instruction set as 8086, but only an 8 bit data bus. The Intel 8080, 8085, and Zilog Z80 were popular in the pre-PC based systems, such as S-100 bus systems, ...
Nuts. You are right of course. I spent so much time programming the 8080 on CP/M systems that I forgot that IBM went with the 8088. I'll make a change. Thanks.
 
  • #17
rcgldr said:
No mention of plugboard programming.
The number of things that I COULD have brought in, that have little or no relevance to modern computing, would have swamped the whole article.
 
  • #18
This looks like an interesting topic. I hope there will be discussion about which languages are most widely used in the math and science community, since this is after all a physics forum.

One thing I noticed is that although you mention LISP, you did not mention the topic of languages for artificial intelligence. I did not see any mention of Prolog, which was the main language for the famous 5th Generation Project in Japan.

It could also be useful to discuss functional programming languages or functional programming techniques in general.

I think it would be interesting to see the latest figures on which are the most popular languages, and why they are so popular. The last time I looked the top three were Java, C, and C++. But that's just one survey, and no doubt some surveys come up with a different result.

Since you mention object-oriented programming and C++, how about also mentioning Simula, the language that started it all, and Smalltalk, which took OOP to what some consider an absurd level.

Finally, I do not see any mention of Pascal, Modula, and Oberon. The work on this family of languages by Prof. Wirth is one of the greatest accomplishments in the history of computer languages.

In any case, I look forward to the discussion.
 
  • #19
phinds said:
Yes, there are a TON of such fairly obscure points that I could have brought in, but the article was too long as is and that level of detail was just out of scope.

I think the level you kept it at was excellent. Not easy to do.

Meanwhile I have a couple of possibly dumb questions about following Insights articles; I will ask them here since Insights seems to be separate from the main forum & I didn't find any help articles in the sitemap for Insights.

1) How in heck does one "vote" on an article? I see at the top of the page "You must sign in to vote"; but (a) I am already signed into the forum, and (b) clicking Register just brings me back to the forum, so (c) eh??

2) I wish there were a way built into Insights to bookmark or "favorite" articles, just as one can watch forum threads. I can follow the comments thread for an article, which is almost as good; but "favoriting" articles would be a nice feature.
 
  • #20
Something that I was never clear on was what made a "scripting language" different from an "interpreted language"? I don't see that much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.
 
  • Like
Likes jedishrfu
  • #21
Nice article, @phinds!

I would like to point out that some interpreted languages, such as MATLAB, have moved to the JIT (just-in-time) model, where some parts are compiled instead of simply being interpreted.

Also, Fortran is still in use not only because of legacy code. First, there are older physicists like me who never got the hang of C++ or Python. Second, many physical problems are more simply translated to Fortran than to other compiled languages, making development faster.
 
  • Like
Likes jedishrfu
  • #22
stevendaryl said:
Something that I was never clear on was what made a "scripting language" different from an "interpreted language"? I don't see that much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.

Others more knowledgeable than me will no doubt reply, but I get the sense that scripting languages are a subset of interpreted languages. So something like Python gets called both, but Java is only interpreted (compiled into bytecode, as Python is also) and not scripted.

https://en.wikipedia.org/wiki/Scripting_language
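
For what it's worth, the "compiled into bytecode" point is easy to see for Python with nothing but the standard dis module; here's a quick sketch (the greet function is just a made-up example):

Code:
import dis

def greet(name):
    return "Hello, " + name

# Python compiles the function to bytecode before the interpreter executes it;
# dis shows those bytecode instructions rather than the source being re-read line by line.
dis.dis(greet)

# compile() makes the same step explicit for an arbitrary snippet of source.
code_obj = compile("x = 2 + 3", "<example>", "exec")
print(code_obj.co_code)  # the raw bytecode bytes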
 
  • #23
Well done Phinds!

One word of clarification on the history of markup: while HTML is considered to be the first markup language, it was in fact adapted from the SGML (1981-1986) standard of Charles Goldfarb by Sir Tim Berners-Lee:

https://en.wikipedia.org/wiki/Standard_Generalized_Markup_Language

And the SGML (1981-1986) standard was in fact an outgrowth of GML (1969), found in an IBM product called Bookmaster, again developed by Charles Goldfarb, who was trying to make it easier to use SCRIPT (1968), a lower level document formatting language:

https://en.wikipedia.org/wiki/IBM_Generalized_Markup_Language

https://en.wikipedia.org/wiki/SCRIPT_(markup)

In between the time of GML (1969) and SGML (1981-1986), Brian Reid developed SCRIBE (1980) for his doctoral dissertation, and both SCRIBE (1980) and SGML (1981-1986) were presented at the same conference (1981). Scribe is considered to be the first to separate presentation from content, which is the basis of markup:

https://en.wikipedia.org/wiki/Scribe_(markup_language)

And then in 1981, Richard Stallman developed TEXINFO (1981) because SCRIBE (1980) had become a proprietary language:

https://en.wikipedia.org/wiki/Texinfo

These early markup languages, GML (1969), SGML (1981), SCRIBE (1980), and TEXINFO, were the first to separate presentation from content.

Before that, there were the page formatting languages SCRIPT (1968) and SCRIPT's predecessor TYPSET/RUNOFF (1964):

https://en.wikipedia.org/wiki/TYPSET_and_RUNOFF

RUNOFF was so named from "I'll run off a copy for you."

All of these languages derived from printer control codes (1958?):

https://en.wikipedia.org/wiki/ASA_carriage_control_characters
https://en.wikipedia.org/wiki/IBM_Machine_Code_Printer_Control_Characters

So basically the evolution was:
- program controlled printer control (1958)
- report formatting via Runoff (1964)
- higher level page formatting macros Script (1968)
- intent based document formatting GML (1969)
- separation of presentation from content via SCRIBE (1980)
- standardized document formatting SGML (1981 finalized 1986)
- web document formatting HTML (1993)
- structured data formatting XML (1996)
- markdown style, John Gruber and Aaron Swartz (2004)

https://en.wikipedia.org/wiki/Comparison_of_document_markup_languages

and back to pencil and paper...

A Printer Code Story
------------------------

Lastly, the printer codes were always an embarrassing nightmare for a newbie Fortran programmer who would write a thoroughly elegant program that generated a table of numbers and columnized them to save paper, only to find he'd printed a 1 in column 1 and receive a box or two of fanfold paper with a note from the printer operator not to do it again.

I’m sure I’ve left some history out here.

- Jedi
 
Last edited:
  • #24
David Reeves said:
One thing I noticed is that although you mention LISP, you did not mention the topic of languages for artificial intelligence. I did not see any mention of Prolog, which was the main language for the famous 5th Generation Project in Japan.
This was NOT intended as a thoroughly exhaustive discourse. If you look at the Wikipedia list of languages you'll see that I left out more than I put in, but that was deliberate.

David Reeves said:
It could also be useful to discuss functional programming languages or functional programming techniques in general.
And yes, I could have written thousands of pages on all aspects of computing. I chose not to.

David Reeves said:
Since you mention object-oriented programming and C++, how about also mentioning Simula, the language that started it all, and Smalltalk, which took OOP to what some consider an absurd level.
See above.

David Reeves said:
Finally, I do not see any mention of Pascal, Modula, and Oberon. The work on this family of languages by Prof. Wirth is one of the greatest accomplishments in the history of computer languages.
Pascal is listed but not discussed. See above.
stevendaryl said:
Something that I was never clear on was what made a "scripting language" different from an "interpreted language"? I don't see that much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.
Basically, I think most people see "scripting" in two ways. First is, for example, BASIC, which is an interpreted computer language, and second is, for example, Perl, which is a command language. The two are quite different but I'm not going to get into that. It's easy to find on the internet.

jedishrfu said:
One word of clarification on the history of markup is that while HTML is considered to be the first markup language it was in fact adapted from the SGML(1981-1986) standard of Charles Goldfarb by Sir Tim Berners-Lee:
NUTS, again. Yes, you are correct. I actually found that all out AFTER I had done the "final" edit and just could not stand the thought of looking at the article for the 800th time so I left it in. I'll make a correction. Thanks.
 
  • #25
stevendaryl said:
Something that I was never clear on was what made a "scripting language" different from an "interpreted language"? I don't see that much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.

You can read the whole Wikipedia articles on "scripting language" and "interpreted language" but this does not really provide a clear answer to your question. In fact, I think there is no definition that would clearly separate scripting from non-scripting languages or interpreted from non-interpreted languages. Here are a couple of quotes from Wikipedia.

"A scripting or script language is a programming language that supports scripts; programs written for a special run-time environment that automate the execution of tasks that could alternatively be executed one-by-one by a human operator."

"The terms interpreted language and compiled language are not well defined because, in theory, any programming language can be either interpreted or compiled."

Speaking of scripting languages, consider Lua, which is the most widely used scripting language for game development. Within a development team, some programmers may only need to work at the Lua script level, without ever needing to modify and recompile the core engine. For example, how a certain game character behaves might be controlled by a Lua script. This sort of scripting could also be made accessible to the end users. But Lua is not an interpreted language.
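
To make that division of labor concrete, here is a rough Python stand-in for the embedding idea (the Character class, the on_update hook, and the behavior script are all made up for illustration; a real engine would host Lua from C or C++ rather than use Python's exec):

Code:
# Rough stand-in for an embedded scripting layer: the "engine" exposes a small
# API, and behavior lives in a script that can change without recompiling the engine.

class Character:
    def __init__(self, name):
        self.name = name
        self.x = 0

    def move(self, dx):
        self.x += dx

# Hypothetical behavior script -- in a real game this would be a Lua file.
behavior_script = """
def on_update(character):
    # Move the character one step to the right each frame.
    character.move(1)
"""

# The engine loads the script into its own namespace and calls the hook each frame.
script_env = {}
exec(behavior_script, script_env)

hero = Character("hero")
for frame in range(3):
    script_env["on_update"](hero)
print(hero.name, "is at x =", hero.x)  # hero is at x = 3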

LISP is not a scripting language. On the other hand, LISP is an interpreted language, but it can also be compiled. You might spend most of your development time working in the interpreter, but once some code is nailed down you might compile it for greater speed, or because you are releasing a compiled version for use by others.

Now perhaps someone will jump in and say "LISP can in fact be a scripting language," etc. I would not respond. ;)
 
  • #26
David Reeves said:
Speaking of scripting languages, consider Lua, which is the most widely used scripting language for game development. Within a development team, some programmers may only need to work at the Lua script level, without ever needing to modify and recompile the core engine. For example, how a certain game character behaves might be controlled by a Lua script. This sort of scripting could also be made accessible to the end users. But Lua is not an interpreted language.

This is a good point. "Scripting" is a pretty loose term.
 
  • #27
stevendaryl said:
Something that I was never clear on was what made a "scripting language" different from an "interpreted language"? I don't see that much difference in principle between Javascript and Python, on the scripting side, and Java, on the interpreted side, other than the fact that the scripting languages tend to be a lot more loosey-goosey about typing.

One key feature is that you can edit and run a script directly, as opposed to, say, Java, where you compile with one command (javac) and run with another (java). This means you can't use the shell trick of #!/bin/xxx to indicate that it's an executable script.

Scripting languages usually can interact with the command shell that you're running them in. They are interpreted and can evaluate expressions that are provided at runtime. The loosey-goosiness is important and makes them more suited to quick programming jobs. The most common usage is to glue applications to the session, i.e. to set up the environment for some application, clear away temp files, make working directories, check that certain resources are present, and to then call the application.

https://en.wikipedia.org/wiki/Scripting_language
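
For concreteness, here is a minimal sketch of that kind of glue script in Python (the ./myapp binary, the directory names, and the input.dat check are all hypothetical placeholders):

Code:
#!/usr/bin/env python3
"""Glue script: set up the environment, clean temp files, check resources,
then hand off to the real application. All names and paths here are made up."""
import os
import shutil
import subprocess
import sys

WORK_DIR = "work"                        # hypothetical working directory
TMP_DIR = os.path.join(WORK_DIR, "tmp")
APP = "./myapp"                          # hypothetical application binary

# Clear away old temp files and recreate the working directories.
shutil.rmtree(TMP_DIR, ignore_errors=True)
os.makedirs(TMP_DIR, exist_ok=True)

# Check that a required resource is present before launching the application.
if not os.path.exists("input.dat"):
    sys.exit("missing input.dat -- aborting")

# Set up the environment and call the application, passing our arguments through.
env = dict(os.environ, APP_TMP=TMP_DIR)
subprocess.run([APP, *sys.argv[1:]], env=env, check=True)

Because of the #! line, the script itself can be marked executable and run like any other command, which is exactly the shell trick Java can't use.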

Java is actually compiled into bytecodes that act as machine code for the JVM. This allows Java to span many computing platforms in a write once, run anywhere kind of way. Java doesn't interact well with the command shell. Programmers unhappy with Java have developed Groovy, which is what Java would be if it were a scripting language. It's often used to create domain specific languages (DSLs) and for running snippets of Java code to see how they work, as most Java code runs unaltered in Groovy. It also has some convenience features so that you don't have to provide import statements for the more common Java classes.

https://en.wikipedia.org/wiki/Groovy_(programming_language)

Javascript in general works only in web pages. However, there is Node.js, for example, which can run Javascript in a command shell. Node provides a means to write a lightweight web application server in Javascript on the server side instead of in Java as a Java servlet.

https://en.wikipedia.org/wiki/Node.js
 
  • #28
Another typo: I believe you meant "fourth generation," not "forth generation."
 
  • #29
vela said:
Another typo: I believe you meant "fourth generation," not "forth generation."
Thanks.
 
  • #30
@phinds - Do you give up yet? This kind of scope problem is daunting. You take a generalized tack, people reply with additional detail. Your plight is exactly why I am loath to try an insight article. You are braver, hats off to you!
 
  • Like
Likes phinds and jedishrfu
  • #31
Yes, don't give up. Remember the Stone Soup story. You've got the soup in the pot and we're bringing the vegetables and meat.

You've inspired me to write an article too.

Jedi
 
  • Like
Likes phinds
  • #32
jedishrfu said:
Scripting languages usually can interact with the command shell that you're running them in. They are interpreted and can evaluate expressions that are provided at runtime.
jedishrfu said:
Java is actually compiled into bytecodes that act as machine code for the JVM. This allows java to span many computing platforms in a write once run anywhere kind of way.

And of course something like Python has both these traits. But there are plenty of variations for scripting languages.

For example, some scripting languages are much less handy: so far as I know, AppleScript can't interact with a command shell, which makes it harder to code in; neither could older versions of WinBatch, back when I was cobbling together Windows scripts with it around 1999-2000; looking at the product pages for WinBatch today, it still doesn't look like this limitation has been removed. Discovering Python after having learned on WinBatch for several years was like a prison break for me.

And I also like the example given above by @David Reeves of Lua, a non-interpreted, non-interactive scripting language: "Within a development team, some programmers may only need to work at the Lua script level, without ever needing to modify and recompile the core engine. For example, how a certain game character behaves might be controlled by a Lua script. This sort of scripting could also be made accessible to the end users. But Lua is not an interpreted language."
 
  • #33
Also missing is the language Forth.
 
  • #34
nsaspook said:
Also missing is the language Forth.
To repeat myself:

This was NOT intended as a thoroughly exhaustive discourse. If you look at the Wikipedia list of languages you'll see that I left out more than I put in, but that was deliberate.
 
  • Like
Likes nsaspook
  • #35
I should add that, regarding Lua, they say it is an interpreted language. "Although we refer to Lua as an interpreted language, Lua always precompiles source code to an intermediate form before running it. (This is not a big deal: Most interpreted languages do the same.) "

This is different from a pure interpreter, which translates the source program line by line while it is being run.

In any case you can read their explanation here.

http://www.lua.org/pil/8.html

BTW it is easy and instructive to write an interpreter for BASIC in C.
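
To illustrate the line-by-line dispatch idea, here is a toy sketch (in Python rather than C, and for a made-up BASIC-like dialect rather than real BASIC):

Code:
# A toy line-by-line interpreter for a made-up BASIC-like dialect.
# Supported statements: LET var = number, PRINT var_or_number, GOTO line.
def run(program):
    lines = sorted(program)        # program: dict of line number -> statement string
    variables = {}
    i = 0
    while i < len(lines):
        stmt = program[lines[i]].split()
        if stmt[0] == "LET":                 # LET X = 5
            variables[stmt[1]] = int(stmt[3])
        elif stmt[0] == "PRINT":             # PRINT X  or  PRINT 42
            print(variables.get(stmt[1], stmt[1]))
        elif stmt[0] == "GOTO":              # GOTO 10
            i = lines.index(int(stmt[1]))
            continue
        i += 1

run({10: "LET X = 5", 20: "PRINT X", 30: "PRINT 99"})   # prints 5, then 99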
 
Last edited by a moderator:
