If Steve Jobs were still alive, he would be celebrating his 60th birthday next week.
A co-founder of Apple and Pixar, Jobs is certainly one of the fathers of the modern world for the computer products he developed.
As Walter Isaacson makes clear in his two most recent books, his biography of Jobs and The Innovators, which chronicles the creative minds behind the digital revolution, even a genius like Jobs was only as good as those who came before him and the talented people around him. In the case of Jobs, it was the other Steve - the engineer Steve Wozniak - who actually built the Apple I and Apple II computers. The two men complemented each other, with Wozniak's technical skills delivering on the vision Jobs had of a user-friendly personal computer.
Two other important men who shaped the world of the 21st century also turn 60 this year. Bill Gates, one of the founders of Microsoft, was also born in 1955. Like Jobs, Gates is a highly intelligent, incredibly talented visionary who was inspired by the work of those who came before him and who owes much of his success to the talent of those around him.
Neither Jobs nor Gates invented the computer or the microprocessor or the mouse or software or the Internet, but their two companies made computers and electronic devices an instrumental part of our everyday lives, and both men were rewarded with riches beyond their dreams for their creativity and their shrewd business sense.
As both men aged, they mellowed and matured, but they also took more personal credit for their success than they deserved. Just as Wozniak does for Jobs, Paul Allen deserves significant recognition for helping both Gates and himself become billionaires. Allen did far more than assist Gates. He was an active partner in delivering the goods on those early contracts that made the Microsoft operating system the standard for business and personal computers.
Isaacson recounts in The Innovators how Jobs resented Gates to his dying day for "stealing" Windows from Apple. Gates accurately recalled that Jobs had himself "stolen" the idea (and many of the developers) for the visual, icon-based desktop from the Xerox research department, whose corporate bosses couldn't see how that innovation would help them sell photocopiers.
In contrast to Gates and Jobs, the third Information Age giant who also turns 60 this year is neither a household name nor a billionaire, but he is every bit as significant. Tim Berners-Lee didn't create the Internet, but he did invent the World Wide Web, the trail navigation system that runs through the land of the Internet. Berners-Lee shared his idea widely instead of patenting it because he believed in the collaborative power of shared knowledge and the riches gained from everyone working and benefitting together. Spot the fellow who drew his paycheque from a publicly-funded research organization.
Yet all three men and the computer pioneers who came before them owe their fame and fortune to a woman born 200 years ago this year. Ada King, the Countess of Lovelace, was the only legitimate child of her famous father, the Romantic poet Lord Byron. She is so revered in the history of computer science that Jimmy Wales, one of the founders of Wikipedia, named his youngest daughter Ada and Google devoted a "doodle" to her on its home page on Dec. 10, 2012, the 197th anniversary of her birth.
Lady Lovelace was the one who saw the potential of the Analytical Engine designed by the Victorian inventor Charles Babbage. In her now legendary "Notes" on Babbage's work, published in 1843, she envisioned not just the counting machine Babbage was trying to perfect but a device that could work with all forms of information, including language, sight and sound. In his book, Isaacson praises how Ada fused her father's artistic sensibilities with her mother's passion for mathematics into a "poetical science." She dismissed the idea of a truly thinking machine, what is now known as artificial intelligence, in favour of human creativity working in tandem with powerful analytical machines.
It is Ada's vision, not the dream of early computer developers or the nightmare of science fiction writers and filmmakers that we would build sentient machines smarter than ourselves, that has prevailed into this century. The more that computer scientists and psychologists learn about machines and minds, the further the possibility of artificial intelligence retreats in favour of an ever-growing symbiotic relationship between humanity and technology, exactly as a young lady who died two weeks shy of her 37th birthday foretold.
-- Managing editor Neil Godbout