Has Information Technology Really Regressed Over the Years?

In summary, the conversation debates whether Information Technology has regressed over the years. The original poster recalls the simplicity and efficiency of programming in Basic in the 80s and the genuine improvement brought by Turbo Pascal, then argues that the rise of Unix and C marked a turning point at which complexity and unnecessary detail became the norm, culminating in even more complex languages like Java and C++. He also points to societal and economic factors that may have driven this, concluding that object-oriented programming and client-server architectures may have caused more harm than good. Other posters push back, defending object-oriented languages as essential for large-scale projects, and the thread ends with a concrete dispute over how to read a single keypress in Java versus Basic.
  • #1
oldtobor
Information Technology is actually a history of regression. I started programming in Basic on Commodores and PCs in the early 80s. It was fast to learn and fast to put ideas into practice. In fact I would argue that 99% of all IT programming problems were already well solved just using some well thought out Basic programs. Most problems IT has to deal with are really relatively simple. Then along came Turbo Pascal, which was a truly great language: fast and very well designed, especially for producing well structured programs. You could really feel a great improvement from Basic to Pascal. You knew things were really getting better. But good things don't last too long. Progress ended.

Progress ended when Unix and C started to become popular. Not so much Unix, which has some good scripting ideas and languages such as AWK, but the worship of the C language was the beginning of REGRESSION. C was complicated, and it was an abrupt departure from progress. I remember that I could quickly whip up good programs in Pascal, but in C things just seemed to slow down. Why did I have to allocate memory? Why do I need the pointers? And so on. So C became popular and programmers started to have to waste time understanding a lot of useless details. Maybe client-server was wrong and mainframe architecture was better.

Fast forward to the mid 90s and you get OO and JAVA and C++. Ten times more complicated, slower, a never ending list of odd questions: why collect garbage? Why do I need to download 10 megabytes? Why is everything an Object? Etc. etc. The end result today is a mass complication of things that were really solved more than 20 years ago. Maybe Javascript and PERL were the correct direction to follow, but crappy Java became the norm. There is a sociological reason for all this: we need to keep people busy at work, so we create a never ending set of complex, obscure ideas and languages, probably because there really is not enough work for everyone. And companies make money by selling hype after hype.

It could be that our social system can furnish enough wealth to everyone with very little work, since we have an enormous EXCESS CAPACITY in almost all sectors. But this is pure politics and sociology. Fast forward to the year 2020 and we will have thousands of very complex languages requiring 200 GB of disk to download. Turbo Pascal occupied 40,000 bytes and ROCKED.
 
  • #2
Actually, programming languages didn't keep developing. They went backwards. There was a time when a simpler and easier language meant progress while a more complex one meant the opposite. The change from assembler to Basic, for example, was exactly this. Then someone (maybe at Sun or Microsoft) decided that it was time to make things HARD. Now you have the high priests of Java, for example, saying that to really know how to use it you must spend months on it. That is really ridiculous. The goal is to make languages ever harder so they can sell books and courses and HYPE by the TON.

I read an elaborate debate on the internet between a guy who said he had been programming in Java for 4 years and concluded that it sucked, and another guy who said he simply didn't know how to use it. That is exactly the point. A language where this kind of debate happens is a bad language. No one would ever argue that another person simply didn't know how to use Pascal or Basic; at most you could debate the design of bad algorithms.

Object oriented Hyporama is another big piece of crap. Objects are just PROCEDURE NAMES in the end. So you make up a nice set of names that have a logical connection between them and you get all the objects you want. If companies were really interested in reuse they would have created a very simple procedural language with many libraries of procedural functions capable of doing anything. You would just enter a question describing the function you needed into a search engine and the language environment would list the names of the closest procedures, END OF STORY. Compare that with having to navigate class libraries, pointers, threads etc.

Another thing that was probably completely wrong was the choice of client-server architectures. These created another level of complexity that is totally useless. IBM mainframes had these problems licked 40 years ago, yet countless designers are still struggling with remote calls, LANs, network objects and all the other problems. Add Java and OO and C++ to all that and you have millions of man years of totally useless work being done.
 
  • #3
On the one hand you ask why you "need to allocate memory and deal with pointers", while on the other you ask why you need "garbage collection". If you don't like the idea of manual memory management, then I'd figure you'd be a fan of JAVA-like languages with garbage collection. Computers have memory and it needs to be managed, because the hardware doesn't provide an infinite memory space. If you know what you're doing, you can use C/C++ to allocate memory as efficiently as possible for your program's needs. If you don't like the extra complexity, then JAVA's garbage collection will take care of it for you. If you want a language that provides garbage collection but still allows pointers and manual stack allocation, then look into C#.
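A minimal Java sketch of the garbage-collected side of that trade-off (the class name and numbers here are made up for illustration): you allocate with `new` and never call anything like `free`; whatever becomes unreachable is the collector's problem.

```java
import java.util.ArrayList;
import java.util.List;

public class GcSketch {
    // Allocate `total` arrays on the heap; keep only every `keepEvery`-th
    // one reachable. Returns how many survive in the list.
    static int allocate(int total, int keepEvery) {
        List<int[]> kept = new ArrayList<>();
        for (int i = 0; i < total; i++) {
            int[] block = new int[1024];        // heap allocation, no free() anywhere
            if (i % keepEvery == 0) kept.add(block);
            // the other arrays become unreachable here and will be
            // reclaimed by the garbage collector whenever it runs
        }
        return kept.size();
    }

    public static void main(String[] args) {
        System.out.println(allocate(1000, 100)); // prints 10
    }
}
```

In C or C++ each of those discarded arrays would need an explicit `free`/`delete`, which is exactly the manual bookkeeping the garbage collector removes, at the cost of some runtime overhead.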

Object Oriented programming is vital for large scale projects. It's not acceptable, nor is it efficient, to have functions and statements scattered through a number of files in a haphazard manner.
Programming languages have evolved to cope with large software endeavours by providing modularity and robustness and by improving security and performance.
Why stop at Basic? You can do as much with assembly as you can with today's languages. Maybe use machine code and go back to binary strings? It's not just about being able to code something, but about being able to manage it and reuse that code successfully. What is a waste of man hours is writing code fragments that get used once and then forgotten, or that are excessively complex to update. I'm a very big proponent of efficient code management and reusability, and JAVA is perfect in this respect.
The recent advancements in computer languages are geared towards making life easier for large corporations, who need to manage large quantities of code in which many different people collaborate. All of these things demand a higher degree of organization than was allowed before the advent of Object Oriented code.
It's not about being able to do more, but about doing it better. You sound frustrated because there is indeed a large quantity of new technologies which you might be unable or unwilling to keep up with, and that's understandable, but I'm sure if, for example, you were an expert in JAVA or any of the .NET languages you wouldn't be over here complaining.
 
  • #4
oldtobor,

Can you provide a resume of all large-scale programming projects which you have participated in? I strongly suspect that you simply have no real-world experience.

- Warren
 
  • #5
Large scale decisions are made when nobody is watching. Ask all the mainframe programmers and companies that depend on those large legacy programs what technology they use. They use IBM 60s technology and languages. Mine is just an impression, since it is virtually impossible to demonstrate scientifically which technology or language is better.

But client-server setups on Unix boxes are just a layer of complexity that makes no sense. Unix itself, or now Linux, is a 35-year-old operating system sold as new; the ideas there are 35 years old, so the truth is that in the meantime no one has had any better ideas.

C++ and Java are complex languages that take too long to learn and don't deliver much. Why can't memory be managed automatically by the computer? Why do we still need makefiles and compilers? We are very behind where we should have been by now. But I know that those who invested time and money in all this Hyporama will defend it, and it won't change anytime soon. I think you need a gigabyte of hard disk for the .NET technology; now that is progress, especially since you have to throw away any computer with less than 500 megabytes of RAM because it is too slow.

Look, I may be wrong on everything. I really don't care about being wrong; anyway, according to the software market I am DEAD WRONG. But if the old Pascal compiler occupied 40K and now we need 1,000 times that for compilers and languages, would anyone say there was a 1,000 times improvement? Even a 3 times improvement is hard to find, since all the new stuff is so incredibly slow on any computer that is just 2 years old.
 
  • #6
-Job- said:
Object Oriented programming is vital for large scale projects. It's not acceptable, nor is it efficient, to have functions and statements scattered through a number of files in haphazard manner.
Programming languages have evolved to cope with large software endeavours by providing modularity, robustness and improve security or performance.

This is more a software DOCUMENTATION PROBLEM. No need to force it on the language; you just need to be very well organized and use a very clear division in your subroutines and procedural names. In the end it is a NAMING GAME, a documentation problem. Large scale programs should actually only be documents and nothing else. The code is actually a side note, if the large scale program is written well. In 1969 they went to the moon with the software they had; now we need tons of money and software to do the same, and who knows when we will ever go back to the moon.
 
  • #7
oldtobor said:
Large scale decisions are made when nobody is watching. Ask all the mainframe programmers and companies that depend on those large legacy programs what technology they use. They use IBM 60s technology and languages. Mine is just an impression, since it is virtually impossible to demonstrate scientifically which technology or language is better.

But client-server setups on Unix boxes are just a layer of complexity that makes no sense. Unix itself, or now Linux, is a 35-year-old operating system sold as new; the ideas there are 35 years old, so the truth is that in the meantime no one has had any better ideas.

C++ and Java are complex languages that take too long to learn and don't deliver much. Why can't memory be managed automatically by the computer? Why do we still need makefiles and compilers? We are very behind where we should have been by now. But I know that those who invested time and money in all this Hyporama will defend it, and it won't change anytime soon. I think you need a gigabyte of hard disk for the .NET technology; now that is progress, especially since you have to throw away any computer with less than 500 megabytes of RAM because it is too slow.

Look, I may be wrong on everything. I really don't care about being wrong; anyway, according to the software market I am DEAD WRONG. But if the old Pascal compiler occupied 40K and now we need 1,000 times that for compilers and languages, would anyone say there was a 1,000 times improvement? Even a 3 times improvement is hard to find, since all the new stuff is so incredibly slow on any computer that is just 2 years old.

I'm no expert, but just because the compiler takes up more space (as a program on its own), does that mean that either the optimised source code for the same simple task (e.g. print "Hello world" in an infinite loop) or the compiled binary output is necessarily more bulky now than it was in an older implementation? To my mind, that should be the basis of the comparison: the input and output of the process, not the mechanisms within.
 
  • #8
oldtobor said:
This is more a software DOCUMENTATION PROBLEM. No need to force it on the language, you just need to be very well organized and use a very clear division in your subroutines and procedural names.
Yes, you do need to force it on the language. Have you worked in a team environment? Everyone will have his/her coding and organizational style, or lack thereof. In these circumstances it's good to enforce some rules to keep things coherent and understandable.

Why can't memory be managed automatically by the computer?
That's what languages like JAVA and VB/J#/C#.NET provide. In JAVA you don't have to free memory yourself. Instead, a process in the JAVA virtual machine runs periodically and "marks" objects that can no longer be reached from the execution thread. This is called "Garbage Collection". Some people don't like the overhead incurred by the garbage collection strategy, and that's why languages like C and C++ are still popular: they can give you the best performance possible.
BASIC is not as fast as C or C++, and speed matters for programs such as servers, where you also need multithreading support.

Microsoft's Visual Basic (VB) .NET is an Object Oriented descendant of BASIC which you might be interested in checking out. I use it often for ASP and it's a simple language to use.
 
  • #9
If you were right, people would still program in BASIC because it was easier.

What actually happens is that people realize that things which are relatively simple in an object oriented language take forever in BASIC. I program a lot in Visual Basic 6 and really, really miss inheritance. I can emulate inheritance using interfaces, but I have to manually write the code to pass on function calls to the superclass.
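That interface-plus-forwarding workaround translates directly to other languages. Here is a minimal Java sketch of the same pattern (all the class names are made up for illustration):

```java
public class DelegationDemo {
    interface Shape { String describe(); }

    static class BaseShape implements Shape {
        public String describe() { return "a shape"; }
    }

    // Composition plus hand-written forwarding, standing in for inheritance.
    static class Circle implements Shape {
        private final Shape base = new BaseShape(); // the emulated "superclass"
        public String describe() {
            // forward to the base, then specialize -- exactly the boilerplate
            // you end up writing by hand in VB6
            return base.describe() + ", specifically a circle";
        }
    }

    public static void main(String[] args) {
        System.out.println(new Circle().describe()); // prints: a shape, specifically a circle
    }
}
```

Every method of the emulated superclass needs a forwarding stub like `describe()` above; real implementation inheritance (`class Circle extends BaseShape`) generates that plumbing for you, which is the convenience being missed here.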

The reason the languages take time to learn is that they're so powerful. Once you know them you can code circles around someone using BASIC.
 
  • #10
Well, this thread really isn't a discussion or an argument... it's really just a tired ol' rant. Oldtobor has done it before, too.

As has been mentioned, variants of BASIC are still used all over the place. Languages like Python are simple, incredibly powerful, and poised to take up a large share of the market. Java's use is declining in general, though it continues to have a strong hold in certain environments. Tight, fast languages like C++ arose so people could write device drivers and operating system kernels without having to memorize their machine's instruction set.

The proliferation of languages these days allows a programmer to choose exactly the right tool for each job. There's no reason at all why someone would choose C++ to write a web application, nor any reason why someone would choose Visual Basic to write a kernel.

Personally, when I look at video-on-demand, multi-gigabit networking, fully-immersive 3D environments, online secure shopping, clusters of ten thousand PCs running blazing-fast search engines, electronic currency, aircraft avionics, and all the other marvels of modern computing technology, I can't help but think things are a hell of a lot further along than a "three times improvement" over Turbo Pascal on an 8086.

But apparently that's just me.

- Warren
 
  • #11
That's like saying English is a regression from grunting and pointing at things...
 
  • #12
Personally, when I look at video-on-demand, multi-gigabit networking, fully-immersive 3D environments, online secure shopping, clusters of ten thousand PCs running blazing-fast search engines, electronic currency, aircraft avionics, and all the other marvels of modern computing technology, I can't help but think things are a hell of a lot further along than a "three times improvement" over Turbo Pascal on an 8086.

Exactly...-nt-
 
  • #13
In the early 80s there was a very simple facility in Basic (get a$ or inkey$) that let you enter one character without hitting Return. In gwbasic you could have something like this:

10 a$=inkey$: if a$="" goto 10
20 print "you hit";a$

It waits, and as soon as you hit a key it prints it out. Simple. I can't find a way to do this in Java. I found a way in Perl after downloading an InKey module, but I can't find a way to do it in Java. Any ideas? This is one of many reasons why I dislike Java and OO: simple things are complicated.

I am using DOS for this and using the simple console input output style program, no windows or graphics, the simplest possible. Thanks anyways for any help.
 
  • #14
Choice of language should be based on a number of factors. You don't choose C++ because it's newer than BASIC; you choose it because it's beneficial to do so for your problem or application, and if it's not, you shouldn't be using it.

For a simple problem, why use a complex language? Use a basic one. The complexity of the language you choose should be proportional to the complexity of the problem and/or the model you're using to solve or describe it, among other factors.

It follows that the complexity of languages increases as the complexity of the tasks we need to accomplish using such languages increases.

"When you have a new hammer, everything is a nail" comes to mind.

Its fascinating stuff...
 
  • #15
Anttech said:
That's like saying English is a regression from grunting and pointing at things...

Actually, we use grunting and pointing in conjunction with English. If a simple hand gesture will do, why spend the time to communicate the universal coordinates or global position? Just grunt and point, job done :P
 
  • #16
In Java,

DataInputStream input = new DataInputStream(System.in);
char c = input.readChar();

Wow. That was tough.

- Warren
 
  • #17
Little trick in Java? Use Google when you don't know how something is done. There are instructions and tutorials EVERYWHERE. I can never remember the exact way to create an applet, but I get it done every time.
 
  • #18
It doesn't work. I may be stupid, and I suck at IT in general, but it doesn't seem to work. You always need to press Return. I need a function that gets out of the loop after a key is pressed, not after I press Return.

If you can find yourself a gwbasic or Quick Basic try the code I wrote before.

I am trying it on DOS using jdk1.3.1, the simplest possible setup. I may be doing something wrong, so correct me if I am. What I tried was this code: after j reaches 3, it should get out of the loop and c should have the last character entered, WITHOUT HAVING TO PRESS RETURN. It doesn't work.

while(j<3) {c = input.readChar(); j++;}


Thanks anyways for the help and suggestions.
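For what it's worth, `DataInputStream.readChar()` is the wrong tool here twice over: console input is line buffered, so the JVM sees nothing until Return is pressed, and `readChar` expects two bytes in the binary format written by `writeChar`, not a typed keyboard character. Java of that era has no portable raw keyboard read; on Unix-like systems the usual workaround is to put the terminal into raw mode first (e.g. with `stty raw -echo`, outside the JVM) and then read single bytes. A sketch of just the reading half (class and method names are made up), demonstrated against a fake stream since a real console would still buffer:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class InKeySketch {
    // Returns the next available byte as a character, like BASIC's inkey$.
    // This only behaves key-by-key if the terminal is already in raw mode;
    // the JVM itself cannot turn off the console's line buffering.
    static char inkey(InputStream in) throws IOException {
        return (char) in.read();
    }

    public static void main(String[] args) throws IOException {
        // A fake keystroke stream stands in for System.in here.
        InputStream fake = new ByteArrayInputStream("q".getBytes());
        System.out.println("you hit " + inkey(fake)); // prints: you hit q
    }
}
```

On real hardware the missing piece is the terminal mode switch, which is why Perl needed an extra module for the same job.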
 

