Tuesday, November 11, 2014

What every computer science major should know

Bill Gates. Steve Jobs. The big names in today's technology are household words. But did you know that the history of computing dates back to around 2700 BC? In modern times, computers have shrunk from room-sized machines to desktop towers to laptops, tablets, and smartphones.
Yet the longer history of machine-assisted human life is just as interesting and inspiring. This week, let's look back at the long history of arguably man's best friend: the computer.
The evolution of man's digital era is divided into generations, each an improvement over the previous in ease of use and user interface. With better programming languages, better internal organization, and steadily improving algorithms, each generation of computers has made our lives easier.
First in the history of computing was the abacus, built to help traders count cows and amphorae by hand and gain an edge in business. The oldest known computing device, however, is the Antikythera mechanism, dating back to around 87 BC, which was used at sea for navigation and for tracking astronomical data.
Computer science took a giant leap in 1843, when a woman wrote the first computer program: the English mathematician Ada Lovelace, working as an assistant to Charles Babbage, laid the foundation of the theory of programmable computers.
1936 was a key year for computer science. Alan Turing and Alonzo Church independently formalized the notion of an algorithm, establishing limits on what can be computed and a "purely mechanical" model of computation. Turing's revolutionary abstraction, the Turing machine, appeared in that same 1936 work and marked the true beginning of the computing era.
In the 1960s, computer science came into its own as a discipline. Operating systems saw major advances, Douglas C. Engelbart demonstrated the computer mouse at SRI in 1968, and at the end of the decade construction began on ARPAnet, a precursor to today's Internet.
Until the early 1970s, computers were used mainly for government purposes, military assistance, and heavy mathematical calculation. The idea of using computers for personal and academic purposes gradually surfaced, and in 1976 Steve Wozniak put the first Apple computer on the table.
Meanwhile, in 1972, Dennis Ritchie developed the influential C programming language at Bell Laboratories, building on Ken Thompson's earlier work; Brian Kernighan later co-authored its defining book with Ritchie. And computer science took to the information superhighway in 1990, when Tim Berners-Lee created the World Wide Web, followed by Marc Andreessen's Mosaic browser in 1993.
It is important to know how we came to live in a world where our glasses can tell us what we're looking at. As we enter the era of wearable technology, smart appliances, embedded chips, and artificial intelligence, knowing this ever-expanding field is essential. It is becoming challenging to discern what belongs in a modern computer science degree, and this inspiring history will help us hit the ground running with confidence!