Move over Patch Tuesday – it’s Ada Lovelace Day!

The second Tuesday of every month is Microsoft’s regular day for security updates, still known to almost everyone by its unofficial nickname of “Patch Tuesday”.

But the second Tuesday in October is also Ada Lovelace Day, celebrating Ada, Countess of Lovelace.

Ada was a true pioneer not only of computing, but also of computer science, and gave her name to the programming language Ada.

The Ada language, intriguingly, emerged from a US Department of Defense project aimed at “debabelising” the world of governmental coding, where every department seemed to favour a different language, or a different language dialect, making it more difficult, more expensive, and less reliable to get them to work together.

The Ada language had numerous syntactic features aimed at improving readability and avoiding common mistakes.

Unlike comments in C, which start with /* and run until the next */, perhaps many lines later, Ada simply ignores anything after -- on any one line, so comments can’t accidentally run on further than you intended.

And instead of enclosing all multiline code blocks within squiggly brackets ({...}, also known as braces), Ada has a unique terminator for each sort of multi-line block, e.g. end record, end loop and end if.

Ada Lovelace, we suspect, would have applauded the clarity of her namesake language, but Ada-the-language never really caught on, and C’s squiggly bracket syntax has largely won the day: squiggly brackets are a vital aspect of C, C++, C#, Go, Java, JavaScript, Perl, Rust and many other popular languages, with Python perhaps the only non-squiggly-bracket language in widespread use.
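As it happens, Lua, the language used for the code snippet at the end of this article, shares both of those habits: comments run from -- to the end of the line, and blocks are closed with a keyword rather than a squiggly bracket (though Lua makes do with a plain end where Ada would spell out end if or end loop). Here’s a tiny illustration of our own, in Lua rather than Ada:

   -- This comment stops at the end of the line, Ada-style, so there is
   -- no closing marker to forget and no risk of swallowing the code below.
   local total = 0
   for i = 1, 5 do
      if i % 2 == 1 then
         total = total + i   -- add up the odd numbers 1, 3 and 5
      end                    -- closes the "if" block
   end                       -- closes the "for" block
   print(total)              -- prints 9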

Ada Lovelace’s era

You might be surprised to find, given how strongly Ada’s name is associated with the beginnings of computer science, that she lived in the first half of the nineteenth century, long before anything that we currently recognise as a computer, or even a calculator, existed.

(Ada died of uterine cancer in 1852 at just 36 years old.)

But although computers in their modern sense didn’t exist in the 1800s, they very nearly did.

Here’s how it almost happened.

Charles Babbage, in the early 1800s, famously devised a mechanical calculating device called the Difference Engine that could, in theory at least, automatically solve polynomial equations of the sixth degree, e.g. by finding values for X that would satisfy:

aX⁶ + bX⁵ + cX⁴ + dX³ + eX² + fX + g = 0

The UK government was interested, because a device of this sort could be used for creating accurate mathematical tables, such as square roots, logarithms and trigonometric ratios.

And any machine good at trigonometric calculations would also be handy for computing things like gunnery tables that could revolutionise the accuracy of artillery on land and at sea.
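To get a feel for why a machine that could do little more than add was nevertheless worth building, here’s a simplified sketch of our own (in Lua, and using a humble second-degree polynomial rather than the sixth-degree sort the real engine targeted) of the method of differences that gave the Difference Engine its name, tabulating the squares 1², 2², 3², … using nothing but repeated additions:

   -- Tabulate f(X) = X*X for X = 1 to 10 using additions only.
   -- For a degree-2 polynomial the second difference is constant (here, 2),
   -- so each new entry in the table costs just two additions.
   local value = 1      -- f(1) = 1
   local diff1 = 3      -- f(2) - f(1) = 3
   local diff2 = 2      -- the constant second difference
   for x = 1, 10 do
      print(x, value)            -- x and f(x), reached by addition alone
      value = value + diff1      -- step the function value forward
      diff1 = diff1 + diff2      -- step the first difference forward
   end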

But Babbage had two problems.

Firstly, he could never quite reach the engineering precision needed to get the Difference Engine to work properly, because it involved sufficiently many interlocking gears that backlash (tiny but cumulative inaccuracies leading to “sloppiness” in the mechanism) would lock it up.

Secondly, he seems to have lost interest in the Difference Engine when he realised it was a dead end – in modern terms, you can think of it as a pocket calculator, but not as a tablet computer or a laptop.

So Babbage leapt ahead with the design of a yet more complex device that he dubbed the Analytical Engine, which could work out much more general scientific problems than one sort of polynomial equation.

Perhaps unsurprisingly, if regrettably in hindsight, the government wasn’t terribly interested in funding Babbage’s more advanced project.

Given that he hadn’t managed to build the mechanism needed for a much simpler equation solver, what chance did a giant, steam-powered, general-purpose computer have of ever delivering any useful results?

The European conference circuit

In a curious twist of international, multilingual co-operation, Babbage travelled to Italy to give a lecture promoting his Analytical Engine.

In the audience was a military engineer named Captain Luigi Menabrea, who was thus inspired to co-operate with Babbage to produce an 1842 paper that described the machine.

Although he was Italian, Menabrea published his paper in French…

…and it was Ada Lovelace who then translated Menabrea’s paper into English.

At Babbage’s urging, Ada also added a series of Notes by the Translator, which turned out not only to be more than twice as long as Menabrea’s original report, but also more insightful, explaining several important characteristics of what we would now call a general-purpose computer.

Walter Isaacson, in his excellently readable book The Innovators, published in 2014, describes how Ada “explored four concepts that would have historical resonance a century later when the computer was finally born”:

  • Ada recognised that the Analytical Engine, unlike the Difference Engine, was truly a general-purpose device, because it could not only be programmed to do one thing, but also, and comparatively easily, be reprogrammed to perform some completely different task.

In Ada’s own words (this was an age in which scientific writing was still rather more in touch with literature than perhaps it is today):

The Difference Engine can in reality (as has been already partly explained) do nothing but add; and any other processes, not excepting those of simple subtraction, multiplication and division, can be performed by it only just to that extent in which it is possible, by judicious mathematical arrangement and artifices, to reduce them to a series of additions. The method of differences is, in fact, a method of additions; and as it includes within its means a larger number of results attainable by addition simply, than any other mathematical principle, it was very appropriately selected as the basis on which to construct an Adding Machine, so as to give to the powers of such a machine the widest possible range. The Analytical Engine, on the contrary, can either add, subtract, multiply or divide with equal facility; and performs each of these four operations in a direct manner, without the aid of any of the other three. This one fact implies everything; and it is scarcely necessary to point out, for instance, that while the Difference Engine can merely tabulate, and is incapable of developing, the Analytical Engine can either tabulate or develope.

  • Ada realised that the Analytical Engine was not limited to encoding and computing with numbers. Although the engine was digital, and built around its ability to perform numerical calculations, those digital operations, she explained, could in theory represent logical propositions (as we take for granted today in if ... then ... else ... end if statements), musical notes, and so on.

As Ada put it:

[The Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine. Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent. The Analytical Engine is an embodying of the science of operations, constructed with peculiar reference to abstract number as the subject of those operations.

  • Ada came up with the concept of reusing parts of what we now call programs. In this sense, she can be said to have invented the concept of the subroutine, including recursive subroutines (functions that simplify the solution by breaking a calculation into a series of similar subcalculations, and then calling themselves).
  • Ada first usefully addressed the question “Can machines think?” This is an issue that has worried us ever since.

The Frankenstein connection

Ada’s father (though she never met him) was the infamous poet Lord Byron, who memorably spent a rainy holiday in Switzerland writing horror stories with his literary chums Percy and Mary Shelley.

Byron’s and Percy Shelley’s efforts in this friendly writing competition are entirely forgotten today, but Mary Shelley’s seminal novel Frankenstein; or, The Modern Prometheus (published in 1818) is popular and well-respected to this day.

The Frankenstein story famously explored the moral dilemmas surrounding what we might today refer to as artificial intelligence. (Frankenstein, don’t forget, was the scientist who conducted the experiment, not the AI that emerged from the project.)

Ada, however, didn’t seem to share her father’s friend’s dystopian concerns about Analytical Engines, or indeed about computers in general.

She offered the opinion, in the final section of her Notes by the Translator, that:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths. Its province is to assist us in making available what we are already acquainted with. This it is calculated to effect primarily and chiefly of course, through its executive faculties; but it is likely to exert an indirect and reciprocal influence on science itself in another manner. For, in so distributing and combining the truths and the formulæ of analysis, that they may become most easily and rapidly amenable to the mechanical combinations of the engine, the relations and the nature of many subjects in that science are necessarily thrown into new lights, and more profoundly investigated. This is a decidedly indirect, and a somewhat speculative, consequence of such an invention.

Just over 100 years later, when Alan Turing famously revisited the issue of artificial intelligence in his own paper Computing Machinery and Intelligence, and introduced his now-famous Turing Test, he dubbed this Lady Lovelace’s Objection.

What to do?

Next time you find yourself writing code such as…

   -- A funky thing: the Ackermann function.
   -- Computable, but not primitive recursive!
   -- (You can't write it with plain old for
   -- loops, yet you can be sure it will finish,
   -- even if it takes a loooooooong time.)

   local function ack(m,n)
      if m == 0 then return n+1 end
      if n == 0 then return ack(m-1,1) end
      return ack(m-1,ack(m,n-1))
   end

…remember that recursive subroutines of this sort all started in the scientific imagination of someone who knew what a computer should look like, and what it probably would look like, yet lived (and sadly died very young) 100 years before any such device ever existed for her to hack on for real.
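If you want to convince yourself that the recursion above really does terminate, and assuming you paste the snippet into a stock Lua interpreter, a couple of small test calls are enough (just don’t be tempted to push m much past 3):

   print(ack(2, 3))   -- prints 9
   print(ack(3, 3))   -- prints 61, already noticeably more work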

Hacking on actual computers is one thing, but hacking purposefully on imaginary computers is, these days, something we can only imagine.

Happy Ada Lovelace Day!

