Recently I've been studying a lot of computer languages -- too many to really count.
Since my own beginnings around 1970 with IBM 360 BAL (Basic Assembly Language) and JCL (Job Control Language), I've come across many dozens of computer languages. Each is compounded by the various processor enhancements and special hardware implementations -- IBM 370, DEC PDP-11, Zilog Z8x, Intel x86 and so many more. Nobody can feel comfortable forever just knowing assembly languages, though, regardless of how many.

Even if there were only a single CPU to learn about, there would have to be some kind of JCL, however crude, to run the sequence of loading and executing its symbolic code. Hopefully JCL, at least as IBM conceived it, represents the bottom rung of possible computer languages -- which by comparison makes the cumbersome but self-descriptive COBOL, or the artfully clumsy and utilitarian MS-DOS, seem heavenly. Dealing with all the other add-ons, GUIs, drivers and other complexities only compounds the difficulty of operating systems and their profusion of control languages.
For me, though, C is the actual "machine language" of software. C++ is a cousin, very nearly a superset of C. But using C you can create anything, including C and C++ themselves, and have a pretty good idea that it will run on just about every computer there is with little or no arm waving. C++ is probably preferred now that object orientation is so much in vogue, but C is the lowest-level portable language that one can rely on.
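
To make that portability concrete, here is a minimal sketch -- just a toy, and the message it prints is of course only an illustration -- confined entirely to the standard C library, so it should build unchanged with virtually any C compiler on virtually any machine:

    #include <stdio.h>

    int main(void)
    {
        /* Nothing here but ISO standard C and its library,
           so any conforming compiler on any machine can build it. */
        printf("Hello from portable C.\n");
        return 0;
    }

The moment a program reaches past that standard library, into GUIs, drivers or vendor extensions, the guarantee starts to erode.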
There are different flavors of and "improvements" on C, but they are usually just another bother for the header files and footnotes of C reference manuals and tutorials. A nearly complete GNU C compiler appeared many years ago and spawned a long succession of releases, so that most of the C and C++ compilers on Unix-like systems today are based on GNU. There are other proprietary versions, and certainly there are different implementations that make darn sure that nothing is completely portable.
Object-oriented languages are all the rage. On top of Unix and C or C++, with lots of variations on a theme, are the application and scripting languages. JCL was an early and startlingly blunt form of one, but then the various shells and scripts evolved into very clever languages all their own, such as Python or Ruby. Perhaps someday there will be some common "X-Script", but so far such an Esperanto Mathematico has not yet been ordained. I hope it is not just a mish-mash of JavaScript and HTML, but the way things on the Internet are going, it might be something like that.
There are profusions of scripts, sometimes only subtly different from one brand to another, but each also having many versions that supersede or intermix with past versions of the same language. One example is the ubiquitous ".BAT" file on MS-DOS, Windows 3.x, 9x, NT, XP and so forth. Although the basic batch file syntax has survived to this day, there is no assurance that specific instances will still function, since they were often tied to strange hardware and software conventions that no longer exist (thankfully -- I got real tired of configuring every little thing in "autoexec.bat" for every new machine that came along...)
I imagine there will always be evolution amongst the several surviving languages, with new architectures demanding new assembler-level software and new problems in society demanding different application-level software, but usually evolution extends current capabilities a bit further and then a little bit more. Syntax might still be C-like in the distant future, merely for historical reasons. But speaking about Unix or XP might someday be as obsolete as speaking about an Edsel or a De Soto.
Speaking English may be superseded as well, and just trying to keep up with the normal slang of the local world is hard enough -- there is the slang of the street and the jargon of individual jobs amongst engineers or plumbers, and the never-ending names of chemicals such as dihydrolylbutylphosphataxinolaminaphine.
Still, outside of computers, the total number of human-level languages will also increase. We communicate these days by moving our fingers on little plastic keys with nearly the same ease as our parents had by directly talking and gesturing. Images on abstract surfaces have become every bit as real as the tables and desks their displays sit upon. The problem remains a matter of describing actions to an intermediary computer, interposing ever-changing collections of machines and languages between one human and others.
This technique is just a subtler version of cave painting. It is meant to get ideas from one brain into another using some symbolic caricaturization. If we don't have to carve stones or wood, or scribble on paper, or tap on keys, it will be some other device that allows us to symbolize our perceptions.
It is said that if a monkey is allowed to type on a Unix machine, then the computer will do something. It is not so much a feat by the monkey as it is a comment on the unreadable acronyms and abbreviated verbiage that Unix shells use to symbolically invoke the shortcuts of shortcuts of shortcuts that Unix programmers have written over the years. Even entire Unix boxes become mere shortcuts in a network of shortcuts that we use in LANs or WANs, and now WLANs -- whole rooms full of machines now effectively existing as symbols inside the "pipes" of the Internet.
The Internet is now the repository of all human language, including (and especially) its own. The Internet might fail and lose some of that data, or it might have redundancies that make failure nearly impossible. In today's world there is great redundancy, but the future is not always certain.
The only thing I fear regarding the loss of information from the Internet is that all electricity might somehow become unavailable for a prolonged period. That would be the end of the data. It would be worse than having only rocks left over from an ancient culture. Even books are more certain to retain information in the far future, because any human can look at the scribbles in a book, even a crumbling book -- but they usually can't look at the bits on a disk without a highly complex electronic tool.
It would be nice, but I don't really need to have all books ever written etched onto the surface of a hundred trillion tungsten atoms folded up inside of a grain of dust. Just something the size of a normal bookcase would be fine. That might not be as convenient for backing up your computer, but it provides some probability that humans will make sense of it in some future time after all the machines are dead.
There are already millions of books gathering dust, perhaps even cities full of buildings full of books gathering dust. The children have gone on to other things. No one needs the "Edsel Handbook" anymore. No one needs "Windows for Workgroups for Dummies" anymore. They may exist for the future like the wax cylinders of old Edison phonographs, or like the cubic miles of National Geographic magazines and millions of coffee coasters made from AOL CDs -- just so much ballast.
Languages have to grow, and they also must decay. A person will never wish to state some sequence of binary digits like "100011011010111011011101111..." to a machine in order to invoke some action like reading a certain document. That bunch of bits would be packaged as somewhat human words, like "Read FILE A" or "Do This or That with File X". But even those bunches of little words might be too repetitious, so they become slangified to the phrase "Duh", pronounced in special ways, with all the right functions somehow getting invoked.
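
As a toy illustration of that packaging, consider a hypothetical C sketch -- the command word "read" and the whole dispatch scheme are inventions for this example, not any real system's convention -- in which one human-sized word stands in for whatever bits actually get executed:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical example: the friendly word "read" stands in for
       the raw bit patterns the machine really executes. */
    static void read_file(const char *name)
    {
        FILE *f = fopen(name, "r");
        if (f == NULL) {
            printf("cannot open %s\n", name);
            return;
        }
        int c;
        while ((c = fgetc(f)) != EOF)
            putchar(c);
        fclose(f);
    }

    int main(int argc, char *argv[])
    {
        /* "Read FILE A" becomes something like: demo read A */
        if (argc == 3 && strcmp(argv[1], "read") == 0)
            read_file(argv[2]);
        else
            printf("usage: %s read <file>\n", argv[0]);
        return 0;
    }

Shrink "read" down far enough -- an alias, a hotkey, a grunted syllable -- and you have arrived at "Duh".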
We tend to drop the "son of" parts of our names that way: we have "John" and "John's son", but not much beyond "John's son's son". We drop those words out over time, so that it may be that no son has been named "John" for quite a few generations, yet they are all still named "Johnson".
It is likely that some kind of high-level language will become standardized at the chip level of computer architecture. That has already happened to a great degree, since all the binary mechanisms CPUs are capable of have been described about as completely as they can be. It is unlikely that computing Pi to a trillion decimal places will really help a farmer's cows make more milk. Perhaps it might help with navigating near the super-massive black hole at the galactic center -- someday, when visiting places like that is not a fantasy.
The practical infinity of computers is very near. Already the Internet is composed of so much information that it is quite impossible for a single human to ever experience it all, or even for a mass of high-speed corporate server farms to read it all. There are as many paths through the massive, ever-growing databases as there are humans, and probably far more.
If a machine ever becomes intelligent, or especially if it can exhibit curiosity, then all that redundant storage of human trivia may be of great use. Imagine if we could read the actual words of the "Gods" that "programmed" the evolution of humans. Some people believe we do have such words, but I think all of them are the words of men, not "Gods". Machines, though, would have the painstakingly detailed blueprints of how their own "DNA" was constructed.
We can almost read our own DNA now, but we need such powerful computers to do all the complicated bookkeeping that we can never intimately know the DNA itself, only occasional groupings that code for certain proteins or the precursor building blocks of our bodies and minds. Identifying each and every mutation that led to each major step in our evolution is not so easy, but we do have clues in the DNA of closely related species.
Then there is also the "DNA of reality" -- the underlying 10- or 11-dimensional strings (or whatever) that wiggle around on the actual boundary between nothingness and somethingness. The machines might someday have greater minds than ours, at least in sheer massiveness, but they will always trip over the same cow pies in the universe as any other intelligence. There are logical loopholes like Klein bottles, and optical illusions and ambiguities, that no amount of intelligence can eliminate.
But machines will have the history of Mankind, all of our legends and fables, to use as warnings. It will be the legacy of what a bunch of monkeys with digital bit boxes could make from the information we were able to know, using languages that we were able to create. Mankind had no such reference manual. We just stumbled into our existence -- somehow.