6. Thomas Flowers (1905-1998)

With a few exceptions, early digital computing devices were electromechanical, built from small electrically driven mechanical switches called relays. These operated slowly, while the basic components of an electronic computer, originally vacuum tubes, have no moving parts save electrons and run extremely fast. The invention of high-speed digital techniques using vacuum tubes made the modern computer possible, and the engineer Thomas Flowers made the earliest systematic use of vacuum tubes for digital data processing. His idea was that electronic equipment should replace existing relay-based systems, and he noted that at the outbreak of war with Germany in 1939, he may have been the only person in Britain who knew that vacuum tubes could be used for high-speed digital computing on a large scale. The first fully functioning electronic digital computer was Colossus, used by the cryptanalysts at Bletchley Park from February 1944. The Government Code and Cypher School had successfully deciphered German radio communications encoded with the Enigma machine from the beginning of the war, and by 1942 it was decoding about 39,000 intercepted messages per month. Messages encrypted by a different method began to be intercepted during the second half of 1941, and the new cipher machine was broken in April 1942. The need to decode this sensitive intelligence as quickly as possible prompted Max Newman to propose, in November 1942, automating key parts of the decryption process with high-speed electronic counting machines. The first machine built to his design was relay-based, and proved unreliable and slow. Flowers instead proposed building an all-electronic system: Colossus, the world's first large-scale programmable electronic computer. Colossus I was installed at Bletchley Park in January 1944. While it lacked two important features of modern computers (it had no internally stored programs and was not a general-purpose machine), to those familiar with the universal Turing machine and the related stored-program principle, Flowers's equipment was proof of the feasibility of using large numbers of vacuum tubes to implement a general-purpose, high-speed, stored-program computer.

7. F.C. Williams (1911-1977) and 8. Tom Kilburn (1921-2001)

In Max Newman's Computing Machine Laboratory at Manchester University, F.C. Williams and Tom Kilburn built the first general-purpose electronic stored-program digital computer. The Manchester "Baby," as it became known, ran its first program in June 1948. The program, stored on the face of a cathode ray tube, was just 17 instructions long. A much-enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at the University of Manchester in February 1951, and about ten in all were sold, in Britain, Holland, and Italy. Source: Cacm.acm.org

9. J. Presper Eckert (1919-1995), 10. John Mauchly (1907-1980), 11. John von Neumann (1903-1957), and the ENIAC programmers: 12. Kay McNulty (1921-2006), 13. Betty Snyder (1917-2001), 14. Marlyn Wescoff (1922-2008), 15. Ruth Lichterman (1924-1986), 16. Betty Jean Jennings (1924-2011), and 17. Frances Bilas (1922-2012)

John von Neumann joined the ENIAC group in 1944. At the Moore School he stressed the importance of the stored-program principle, including the possibility of allowing the computer to modify its own program in useful ways while running. Because von Neumann was a renowned figure who made the idea of a high-speed stored-program digital computer widely known, it became common, albeit historically misleading, to refer to electronic stored-program digital computers as von Neumann machines. As Fortune notes, when the ENIAC was being designed at Penn in 1945, it was assumed that it would perform a specific set of calculations. But the end of the war meant the machine was needed for many other kinds of calculations, involving sonic waves, weather patterns, and the explosive power of atom bombs, that would require it to be reprogrammed constantly. Six women programmed the ENIAC, learning to do so in 1946 without programming languages or manuals, because none yet existed; they had only logical diagrams to help them. They showed that a computer's programming would be just as critical as its hardware design and construction. They understood the technology as well as anyone and were able to diagnose problems as well as, if not better than, the engineers, who had originally thought that assembling the hardware was the most critical part of the project and thus a man's job. Source: Keystone / Getty Images

18. Grace Murray Hopper (1906-1992)

As PBS reports, Hopper joined WAVES, or Women Accepted for Volunteer Emergency Service, a part of the U.S. Naval Reserve, in 1943, and became Lieutenant Hopper a year later. She was assigned to the Bureau of Ships Computation Project at Harvard, which built a machine that made quick work of difficult calculations for tasks such as laying minefields. Howard Aiken directed the research, which ultimately included the development of the first programmable digital computer, the Mark I. Hopper coined the terms "bug" and "debug," as they relate to computer errors and their repair, after extracting a moth from the machine while searching for the cause of a failure. In 1949 she joined the company founded by Eckert and Mauchly of ENIAC fame. They created Univac, a computer that recorded information on high-speed magnetic tape, a major innovation over the then-standard punched cards. After the company was acquired by Sperry Rand, Hopper continued to make significant progress toward eliminating errors, creating a program that translated programmers' code into machine language. She and her team then created Flow-matic, the first programming language to use English words. It was later incorporated into COBOL, the programming language that made computing a tool of the business world, not just the scientific one. Hopper also led an effort to standardize COBOL and to persuade the entire Navy to use the language. She was a proponent of code standardization and reliability, and under her guidance the Navy developed a series of programs to validate COBOL compilers. The validation idea had a broad influence on other languages and organizations, and eventually led to standards and validation facilities for most programming languages. Source: Agnesscott.edu

19. Ralph Baer (1922-2014)

Together with Atari's Pong, which had more advanced electronics and sound, Baer's Magnavox Odyssey moved gaming into a faster, more technical domain, and Baer noted that were it not for the enthusiastic audience of video game fans, high processing speeds and complex computer graphics might be found only in the business and science worlds.

20. Edgar F. Codd (1923-2003)

As the A.M. Turing Award citation notes, Edgar F. Codd invented the relational data model, an innovation that spawned a multibillion-dollar database industry. In the late 1950s he led the team at IBM that developed the world's first multiprogramming system. Multiprogramming refers to the ability of independently developed programs to run at the same time: when one program is waiting for an event to occur, the machine's central processing unit can work on another. Multiprogramming is now common on almost every computer system. Codd then focused on high-level software specification methods before turning his attention to databases. Though there were many database products at the time, they were hard to use and required highly specialized technical skills, and they lacked a solid theoretical foundation. Codd, who understood the need for such a foundation, established one by inventing the relational data model, often remembered as one of the great technological accomplishments of the 20th century. The relational model offers a method of structuring data using mathematical relations, grid-like structures built from columns and rows. The physical representation of a relation in a database is popularly referred to as a table, and under the relational model all data must be contained in tables. The relational model offered a theoretical framework within which a whole range of database problems could be tackled, and essentially all the database systems in use today are based on Codd's ideas. Source: Nap.edu
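The idea is easy to see in miniature. Below is a minimal sketch using Python's built-in sqlite3 module; the pioneers table and its rows are invented for this illustration, but the shape (a relation as columns and rows, queried declaratively) is exactly what the relational model prescribes.

```python
# A minimal illustration of Codd's relational model using Python's
# built-in sqlite3 module. The table and rows are invented for this sketch.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database

# A relation ("table") is a grid of columns and rows.
conn.execute("CREATE TABLE pioneers (name TEXT, born INTEGER, invention TEXT)")
conn.executemany(
    "INSERT INTO pioneers VALUES (?, ?, ?)",
    [
        ("Edgar F. Codd", 1923, "relational model"),
        ("Grace Hopper", 1906, "compiler"),
        ("Tim Berners-Lee", 1955, "World Wide Web"),
    ],
)

# A query describes *what* data is wanted, not *how* to fetch it;
# the system works out the access path from the relational structure.
for name, born in conn.execute(
    "SELECT name, born FROM pioneers WHERE born < 1950 ORDER BY born"
):
    print(name, born)
```

That separation between the logical description of data and its physical storage is what made Codd's model so influential.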

21. John Warner Backus (1924-2007)

John Warner Backus directed the development of Fortran (short for Formula Translation), the first popular high-level programming language. The Washington Post states that before Fortran, computers had to be painstakingly hand-coded, programmed in raw strings of digits that triggered behavior inside the machine. Fortran was a high-level language that abstracted that work away, allowing programmers to enter commands through a more intuitive system; the computer could then translate the programmer's input into machine code on its own. Fortran reduced the number of programming statements needed to operate a computer by a factor of 20, and it showed skeptics that computers could run efficiently without hand-coding. Programming languages and applications proliferated afterward, and Fortran is still in use today. Backus also developed a method for defining the syntax of programming languages, known as the Backus-Naur Form (BNF). Source: Amturing.acm.org
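As an illustration of what BNF does, here is a toy grammar for sums of integers written as BNF-style rules, together with a small Python recognizer; both the grammar and the code are invented for this sketch and are not Backus's own notation or programs.

```python
# A toy illustration of BNF: rules that define which strings belong
# to a language. Grammar and recognizer invented for this sketch.
#
# Grammar, in BNF:
#   <expr>   ::= <number> | <number> "+" <expr>
#   <number> ::= <digit> | <digit> <number>
#   <digit>  ::= "0" | "1" | ... | "9"

def parse_expr(s: str, i: int = 0) -> int:
    """Return the index just past a valid <expr> starting at i, or raise."""
    i = parse_number(s, i)
    if i < len(s) and s[i] == "+":
        i = parse_expr(s, i + 1)        # <number> "+" <expr>
    return i

def parse_number(s: str, i: int) -> int:
    if i >= len(s) or not s[i].isdigit():
        raise SyntaxError(f"digit expected at position {i}")
    while i < len(s) and s[i].isdigit():  # <digit> <number>
        i += 1
    return i

def matches(s: str) -> bool:
    """True if the whole string derives from <expr>."""
    try:
        return parse_expr(s) == len(s)
    except SyntaxError:
        return False

print(matches("12+34+5"))  # True
print(matches("12++5"))    # False
```

Compiler writers still describe language syntax in essentially this form before writing the code that enforces it.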

24. John McCarthy (1927-2011)

John McCarthy was a pioneering figure in the field of artificial intelligence, as Stanford notes. He coined the term "artificial intelligence" and spent the next five decades developing the field. In 1958 McCarthy invented LISP, the second-oldest programming language after Fortran; LISP is still in use today and has long been a language of choice for artificial intelligence programming. In the late 1950s and early 1960s he also developed the concept of computer time-sharing, an innovation that significantly improved the efficiency of distributed computing, decades before the era of cloud computing. In a 1960 paper, McCarthy presented the principles of his programming philosophy and described a hypothetical program that might one day achieve human-level intelligence. In 1966, he drew attention by hosting a series of four simultaneous computer chess matches, played via telegraph against rivals in Russia; the matches took several months to complete. Later, McCarthy would refer to chess and other board games as the "Drosophila of artificial intelligence," a reference to the fruit flies that proved so important in early genetic studies. He went on to develop the first hand-eye computer system, in which the user could see real 3D blocks through a video camera and direct a robotic arm to stack and arrange them. McCarthy co-founded the artificial intelligence project at MIT and founded what became the Artificial Intelligence Laboratory at Stanford.

25. Jean E. Sammet (1928-)

Jean E. Sammet managed Sperry Gyroscope Co.'s first scientific programming group, and joined IBM in 1961 to organize and manage the Boston Programming Center. According to the IEEE Computer Society, she developed the definition of and guided the development of FORMAC (FORmula MAnipulation Compiler), the first widely used language for symbolic mathematical computation. Source: Alumnae.mtholyoke.edu

26. Gordon E. Moore (1929-) and 27. Robert N. Noyce (1927-1990)

Gordon Moore co-founded Intel with Noyce and is best known for Moore's law, his 1965 observation that the number of transistors on a chip doubles at a regular pace. The invention of the integrated circuit, or microchip, itself is attributed to Noyce along with Jack Kilby. In July 1959, Noyce applied for U.S. Patent 2,981,877, "Semiconductor Device and Lead Structure," a type of integrated circuit. His independent effort was documented only a few months after Jack Kilby's key findings, and although Kilby's invention came six months earlier, Noyce has not been denied the title of co-inventor. Source: Intel.com blogs

28. Philip Don Estridge (1937-1985)

Don Estridge led the development of the first IBM personal computer and is regarded as the father of the IBM PC. According to The New York Times, it was under Estridge's guidance that a small team of IBM employees began working on IBM's first microcomputer in 1980. No one in the company at the time had any idea that the project would revolutionize the computer industry, putting millions of small computers on office desktops and kitchen tables around the world. Estridge's engineers had come from the world of massive machines, and his main challenge was getting them to work out how a non-specialist could use an IBM machine easily; he found that how people responded to a machine emotionally was almost more important than what the machine actually did. Estridge and his team developed the prototype of a small office computer, soon called simply the PC, in just four months. Within a year the machine was on retail shelves, and by late 1983 the PC had overtaken the Apple II as the best-selling personal computer. The team broke a lot of rules at IBM, and Estridge was given the authority to make whatever decisions were required to get the company into the personal computer business quickly. He rejected components made by IBM, instead selecting cheap off-the-shelf components from other vendors, and he made the machine's design specifications public, allowing thousands of people to write software for it. Several of those programmers built multimillion-dollar companies, and IBM's sales were driven by the availability of a wide range of application programs. Source: Computinghistory.org

31. Ray Tomlinson (1941-)

According to The Verge, in 1971 Ray Tomlinson was a recent MIT graduate hired to help develop the early components of the Advanced Research Projects Agency Network (ARPANET), the Internet's predecessor. He took it upon himself to create a networked messaging system. Most computers at the time allowed their users to leave messages for one another, but there was little call for sending messages across computers, because so few computers were networked. Tomlinson's solution used the now-ubiquitous @ symbol to mark networked communications, separating the user's name from the name of the host machine. Only later, reflecting at the ARPANET's 25th anniversary, did he come to realize how significant the invention was. The idea arose organically, and many programmers came to work on it as people latched onto the idea of leaving messages for each other on their machines. Source: Raytheon.com

32. Ken Thompson (1943-) and 33. Dennis Ritchie (1941-2011)

The Computer History Museum reports that Ken Thompson and Dennis Ritchie, who developed the UNIX operating system in 1969, needed a higher-level language to give them greater control over the operating system's details. When Ritchie rewrote UNIX in the C programming language, it became a truly portable operating system that could run on a wide range of different hardware platforms. The C language itself was widely adopted and is now one of the most commonly used languages. UNIX has become a cornerstone of the modern world's technical infrastructure: UNIX or one of its many variants runs on devices ranging from supercomputers to smartphones, and nearly everything on the internet uses both C and UNIX. Even Windows was once written in C, and Mac OS X and iOS are underpinned by UNIX. Source: Computerhistory.org

34. Radia Perlman (1951-)

In the late 1960s and 1970s, when Radia Perlman attended MIT, she was one of just a few dozen women in a class of about 1,000. The Atlantic states that she went on to become a leader in the field of computer science, designing the Spanning Tree Protocol (STP), technology that helped make today's Internet possible. Spanning Tree is a network protocol that provides a loop-free topology for any bridged Ethernet local area network. The basic function of STP is to prevent bridge loops and the broadcast radiation that results from them, while still allowing redundant links to be included in a network design so that they provide automatic backup paths if an active link fails. Perlman has also contributed significantly to other areas of network design and standardization, such as link-state routing protocols; she invented TRILL to fix some of Spanning Tree's deficiencies and pioneered the teaching of computer programming to young children by creating TORTIS, a version of the educational robotics language LOGO. Perlman is often referred to as the "mother of the Internet," but she told The Atlantic she dislikes the title, because no one person invented the Internet. She allows that she made some important improvements to the underlying infrastructure, but says no particular technology really caused the Internet to succeed: its success was due not to specific technologies but to the variety of ways in which people used it. Source: Nwrwic.org
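The spanning-tree idea can be sketched in a few lines: elect a root (in real STP, the bridge with the lowest ID, discovered by exchanging configuration messages), keep one shortest path from each bridge to the root, and block the remaining redundant links so they sit idle as backups. The Python toy below uses an invented network and is only an illustration of the computation, not Perlman's actual protocol, which must also re-converge automatically when links fail.

```python
# Simplified illustration of the spanning-tree idea (not the real STP,
# which elects the root by exchanging BPDU messages between bridges).
from collections import deque

# Invented example network: bridges and redundant links forming loops.
links = {("A", "B"), ("B", "C"), ("C", "A"),  # a loop
         ("C", "D"), ("D", "B")}              # another redundant path

def spanning_tree(links):
    """Return the links kept active: a loop-free tree rooted at the
    bridge with the lowest ID, mimicking STP's root election."""
    neighbors = {}
    for a, b in links:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    root = min(neighbors)            # lowest bridge ID wins root election
    active, seen, queue = set(), {root}, deque([root])
    while queue:                     # breadth-first = shortest path to root
        node = queue.popleft()
        for nxt in sorted(neighbors[node]):
            if nxt not in seen:
                seen.add(nxt)
                active.add((node, nxt))
                queue.append(nxt)
    return active

kept = spanning_tree(links)
print("active:", sorted(kept))                      # loop-free tree
print("blocked:", len(links) - len(kept), "links")  # held as backups
```

The blocked links carry no traffic while the tree is healthy, but a real implementation would reactivate one of them if an active link went down.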

35. Richard Stallman (1953-)

From 1971 to 1984, Richard Stallman worked at MIT's Artificial Intelligence Lab, learning operating system development by doing it. He wrote the first extensible Emacs text editor there in 1976 and developed the AI technique of dependency-directed backtracking, also known as truth maintenance. According to his website, in 1983 Stallman announced the project to develop the GNU operating system, a Unix-like operating system intended to be entirely free software, and with that announcement he launched the free software movement; he founded the Free Software Foundation in 1985. The GNU/Linux system, a variant of GNU that also uses the kernel Linux developed by Linus Torvalds, runs on tens of millions or even hundreds of millions of computers. But distributors often include non-free software in those systems, and since the 1990s Stallman has campaigned for free software and against both software patents and the dangerous extension of copyright law. (Correction 2/17/15: adjusted the number of computers running GNU/Linux.)

36. Bill Gates (1955-)

In 1981, Apple invited Microsoft to help develop software for the Macintosh computer, and it was through this exchange of information that Microsoft came to build Windows. Apple's system used a mouse to drive a graphical interface, displaying text and images on the screen; it differed drastically from the text-and-keyboard MS-DOS system, and Microsoft built Windows around a graphical interface of its own. In 1985, Microsoft released Windows, with a look visually similar to Apple's. Gates took Microsoft public in 1986 and became an instant millionaire.

37. Steve Jobs (1955-2011)

Steve Jobs co-founded Apple Computer with Steve Wozniak in 1976, when Jobs was just 21. They started out in the Jobs family garage and are credited with revolutionizing the computer industry by making machines smaller, more intuitive, and accessible to the general consumer. According to Biography.com, Wozniak designed a series of user-friendly computers, and Jobs initially took care of selling them, at $666.66 each. Apple Computer became a publicly traded company in 1980, reaching a market value of $1.2 billion by the end of its first trading day. The next several Apple products suffered design flaws, and IBM surpassed Apple in sales. Apple launched the Macintosh in 1984; it was still not compatible with IBM machines, and the company began phasing Jobs out. In 1985, he left Apple to start NeXT, and he also bought an animation company that became Pixar, which merged with Walt Disney in 2006. NeXT struggled to sell its advanced operating system to mainstream consumers, and in 1996 Apple finally purchased the company. Jobs then returned to his post as chief executive of Apple and revitalized the business with products such as the iMac; strong marketing and sleek design began to win customers' favor again. The company launched products such as the iPod, iPhone, and MacBook Air, each of which had a profound impact on the path of modern technology. Jobs died in 2011 at the age of 56, after battling pancreatic cancer for nearly a decade. Source: Justin Sullivan / Getty Images

38. Tim Berners-Lee (1955-)

Tim Berners-Lee is best known as the inventor of the World Wide Web, which he proposed in 1989. He founded the World Wide Web Consortium (W3C), a forum for the web's technical development, founded the Web Foundation, and co-founded the Open Data Institute. Berners-Lee developed the web while at CERN, the massive particle physics laboratory near Geneva, and the first web client and server were released in 1990. According to the World Wide Web Foundation's website, he noticed that the many scientists participating in experiments at CERN and returning to their laboratories around the world were eager to exchange data and findings but found it difficult to do so. He recognized the unrealized potential of millions of internet-connected computers and wrote a proposal outlining a series of technologies that would make the Internet truly accessible: what would become the World Wide Web. By October 1990, he had specified the three fundamental technologies that remain the foundation of today's web: HTML, URI, and HTTP. These have been refined as web technology has proliferated. He also built the first web page editor/browser and the first web server. By the end of 1990 the first web page was being served, and in 1991 people outside CERN joined the new online community. In 1993, CERN announced that anybody could use the technology, and the web has changed the world since then. Source: Carl Court / AFP / Getty
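Those three technologies still map directly onto a few lines of code. The sketch below is invented for illustration and uses only Python's standard library: it serves an HTML page over HTTP, and the page's hyperlink names another resource by its URI (here, info.cern.ch, the address of the first website). The port number and page contents are arbitrary.

```python
# Minimal sketch of the web's three building blocks, using only the
# Python standard library: HTML (the page), URIs (the link targets),
# and HTTP (the protocol the server speaks). Invented for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

# An HTML document; the <a> tag's href is a URI naming another resource.
PAGE = b"""<html><body>
<h1>Hello, web</h1>
<p><a href="http://info.cern.ch/">The first website</a></p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):  # answer every HTTP GET request with the page
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Visit http://localhost:8000/ in a browser to fetch the page.
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```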

39. Linus Torvalds (1969-)

Linus Torvalds developed the Linux kernel and supervises the open-source development of the widely used Linux operating system. After buying a personal computer, he started using Minix, a Unix-inspired operating system created by Andrew Tanenbaum. In 1991, Torvalds began working on a new kernel that would later be called Linux, and after forming a team of volunteers, the first version was released in 1994. He continues to supervise Linux development and remains the final authority on what new code is accepted into the mainline Linux kernel. Source: Jarno Mela / AFP / Getty Images

40. Larry Page (1973-) and 41. Sergey Brin (1973-)

Larry Page and Sergey Brin met at Stanford University, where for a research project they created a search engine that ranked results according to the popularity of pages. They explored how sites linked to other web pages and realized this would be a good way to search the internet, helping people find pages with more incoming links, particularly from reliable websites; they also noticed that the most popular result would often be the most useful. Following the mathematical term googol, which denotes the number 1 followed by 100 zeroes, they named the search engine Google. According to Biography.com, the name reflected their goal of organizing the massive amount of information available on the internet. Source: Ralph Orlowski / Getty Images
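That ranking idea can be sketched compactly: give every page a score, then repeatedly let each page share its score across its outgoing links, so a link from a well-linked ("reliable") page is worth more than a link from an obscure one. The toy below uses an invented four-page link graph and only illustrates the principle; it is not Google's actual algorithm.

```python
# Simplified link-popularity ranking in the spirit of PageRank; the
# link graph is invented and this is not Google's production algorithm.
links = {  # page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def rank(links, damping=0.85, iterations=50):
    """Iteratively share each page's score across its outgoing links,
    so links from well-linked pages count for more."""
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                new[target] += damping * score[page] / len(outgoing)
        score = new
    return score

# Page "c", with the most incoming links, should rank highest.
for page, s in sorted(rank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(s, 3))
```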
42. Marc Andreessen (1971-)

Marc Andreessen co-developed Mosaic, the first widely popular graphical web browser, while working at the University of Illinois's National Center for Supercomputing Applications (NCSA). In 1992, Andreessen recruited Eric Bina, a fellow NCSA employee, to help with the project. Mosaic was more graphically sophisticated than other browsers of its period and included significant innovations such as the image tag, which allowed images to be embedded directly in web pages; earlier browsers allowed images to be viewed, but only as separate files. Mosaic also featured a graphical interface with clickable buttons that made it easy for users to navigate, and controls that let them scroll through text. Another of Mosaic's most innovative features was the hyperlink: in earlier browsers, hypertext links carried reference numbers that the user typed in to navigate to the linked document, while Mosaic let users simply click a link to retrieve it. Mosaic was released on NCSA's servers in 1993, and tens of thousands of people downloaded the program within weeks. The original version was for Unix, so Andreessen and Bina assembled a team to develop PC and Mac versions, and Mosaic's success skyrocketed: more users meant a bigger audience for the web, and bigger audiences catalyzed the creation of more content. Andreessen knew that NCSA would take over Mosaic when he graduated, so he moved to Palo Alto, in Silicon Valley, and built a team with the goal of creating a product that would surpass the original Mosaic. They created Netscape, released in 1994, which within weeks became the browser of choice for most web users. It included new HTML tags that allowed programmers more flexibility and imagination, and by 1996 it was used by 75 percent of web users. Netscape eventually lost its supremacy to Microsoft and other later rivals (partly because of the browser wars and partly because of a changing landscape in which Netscape's pricing structure became its undoing), and AOL purchased it in 1999. Andreessen has gone on to many other projects, founding companies and serving on the boards of giants including Twitter, eBay, and HP.

43. Mark Zuckerberg (1984-)

Mark Zuckerberg famously co-founded Facebook from his Harvard dorm room. Biography.com reports that fellow Harvard students Divya Narendra and twins Cameron and Tyler Winklevoss approached Zuckerberg to collaborate on an idea for a social networking site that would use information from Harvard's student networks to create a dating site. He agreed to help, but soon began collaborating with Dustin Moskovitz, Chris Hughes, and Eduardo Saverin on a social network of his own. They created a platform that allowed users to create profiles, upload images, and interact with others, and they ran the site, originally called Thefacebook, from a dorm room until June 2004. Zuckerberg dropped out of Harvard after his sophomore year and moved to Palo Alto to work on Facebook full time. Facebook had 1 million users by the end of 2004, and by the end of 2005 an investment of $12.7 million from Accel Partners had helped grow Facebook's user base to more than 5.5 million.
