Essay on History of Computer

Students are often asked to write an essay on History of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on History of Computer

Early Beginnings

Computers didn’t always look like the laptops or smartphones we use today. The earliest computing device was the abacus, invented around 2400 BC. It used beads to help people calculate.

First Mechanical Computer

In 1822, Charles Babbage, a British mathematician, designed a mechanical computer called the “Difference Engine.” It was supposed to perform mathematical calculations.

The Birth of Modern Computers

The first modern computers were created in the 1930s and 1940s. They were huge, often filling an entire room, and used vacuum tubes to process information.

Personal Computers

In the 1970s, companies like Apple and IBM started making personal computers. This made it possible for people to have computers at home.

Remember, computers have come a long way and continue to evolve!

Also check:

  • Paragraph on History of Computer

250 Words Essay on History of Computer

Introduction

The history of computers is a fascinating journey, tracing back several centuries. It illustrates human ingenuity and evolution from primitive calculators to complex computing systems.

Early Computers

The concept of computing dates back to antiquity. The abacus, developed around 2400 BC, is often considered the earliest computer. In the 19th century, Charles Babbage conceptualized and designed the first general-purpose mechanical computer, the Analytical Engine, which used punch cards for instructions.

Birth of Modern Computers

The 20th century heralded the era of modern computing. The first programmable computer, the Z3, was built by Konrad Zuse in 1941. However, it was the Electronic Numerical Integrator and Computer (ENIAC), developed in 1946, that truly revolutionized computing with its electronic technology.

Personal Computers and the Internet

The 1970s and 1980s saw the advent of personal computers (PCs). The Apple II, introduced in 1977, and IBM’s PC, launched in 1981, brought computers to the masses. The 1990s marked the birth of the internet, transforming computers into communication devices and information gateways.

Present and Future

Today, computers have become an integral part of our lives, from smartphones to supercomputers. They are now moving towards quantum computing, promising unprecedented computational power.

In summary, the history of computers is a testament to human innovation, evolving from simple counting devices to powerful tools that shape our lives. As we look forward to the future, the potential for further advancements in computing technology is limitless.

500 Words Essay on History of Computer

The Dawn of Computing

The history of computers dates back to antiquity with devices like the abacus, used for calculations. However, the concept of a programmable computer was first realized in the 19th century by Charles Babbage, an English mathematician. His design, known as the Analytical Engine, is considered the first general-purpose computer, although it was never built.

The first half of the 20th century saw the development of electro-mechanical computers. The most notable was the Mark I, developed by Howard Aiken at Harvard University in 1944. It was the first machine to automatically execute long computations.

During the same period, the ENIAC (Electronic Numerical Integrator and Computer) was developed by John Mauchly and J. Presper Eckert at the University of Pennsylvania. Completed in 1945, it was the first general-purpose electronic computer. However, it was not programmable in the modern sense.

The Era of Transistors

The late 1940s marked the invention of the transistor, which revolutionized the computer industry. Transistors were faster, smaller, and more reliable than their vacuum tube counterparts. The first transistorized computer was built at the University of Manchester in 1953.

The 1950s and 1960s saw the development of mainframe computers, like IBM’s 700 series, which dominated the computing world for the next two decades. These machines were large and expensive, but they allowed multiple users to access the computer simultaneously through terminals.

Microprocessors and Personal Computers

The invention of the microprocessor in the 1970s marked the beginning of the personal computer era. The Intel 4004, released in 1971, was the first commercially available microprocessor. This development led to the creation of small, relatively inexpensive machines like the Apple II and the IBM PC, which made computing accessible to individuals and small businesses.

The Internet and Beyond

The 1980s and 1990s brought about the rise of the internet and the World Wide Web, expanding the use of computers into every aspect of modern life. The advent of graphical user interfaces, such as Microsoft’s Windows and Apple’s Mac OS, made computers even more user-friendly.

Today, computers have become ubiquitous in our society. They are embedded in everything from our phones to our cars, and they play a critical role in fields ranging from science to entertainment. The history of computers is a story of continuous innovation and progress, and it is clear that this trend will continue into the foreseeable future.

That’s it! I hope the essay helped you.

If you’re looking for more, here are essays on other interesting topics:

  • Essay on Generation of Computer
  • Essay on Computer Technology Good or Bad
  • Essay on Computer Network


Happy studying!


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. By the early 20th century, advancing technology enabled ever more complex computers, which grew larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1848: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called "notes," turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society . "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
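
Lovelace's notes laid out, step by step, how the Analytical Engine could generate Bernoulli numbers. As a rough modern illustration of what such an algorithm computes (not Lovelace's actual procedure or notation, and written in present-day Python rather than the Engine's operations), the numbers follow from a standard recurrence:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1 (so B_1 = -1/2)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        total = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-total / (m + 1))
    return B

print(bernoulli_numbers(8))  # B_2 = 1/6, B_4 = -1/30; odd-indexed terms after B_1 are 0
```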

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

The original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory; it is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers ," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
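
The jobs described here are easy to reproduce today. The snippet below (purely illustrative, not EDSAC code) produces the same kind of output as EDSAC's first runs: a table of squares and a list of primes.

```python
# A table of squares and a short list of primes, echoing EDSAC's first programs (illustrative only).
squares = [(n, n * n) for n in range(1, 11)]
primes = [n for n in range(2, 50) if all(n % d for d in range(2, int(n ** 0.5) + 1))]
print(squares)
print(primes)
```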

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, which stands for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect" includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute . This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The magazine cover of the January issue of "Popular Electronics" highlights the Altair 8080 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Microsoft. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, scatter windows away by shaking another window, easy-to-access jumplists, easier previews of tiles and more, TechRadar reported .  

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy — creating a quantum computer that could feasibly outperform the most powerful classical computer — albeit for a very specific problem with no practical real-world application. The team described the computer, dubbed "Sycamore," in a paper that same year in the journal Nature. Achieving quantum advantage — in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer — is still a ways off.

2022: Frontier, the first exascale supercomputer and the world's fastest, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. This machine ushered in the era of exascale computing, which refers to systems that can perform more than one exaFLOP — a quintillion floating-point operations per second. Only one machine, Frontier, is currently capable of reaching such levels of performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K . Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table. 

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second then progressed to incorporate transistor-based computing between the 50s and the 60s. In the 60s and 70s, the third generation gave rise to integrated circuit-based computing. We are now in between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years — from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program published by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited this app with propelling the Apple II to the success it became, according to co-creator Dan Bricklin.

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • " A Brief History of Computing " by Gerard O'Regan (Springer, 2021)


Modern (1940’s-present)

58 History of Computers

Chandler Little

History of Computers

Modern technology first started evolving when electricity came into more frequent everyday use. One of the biggest inventions of the 20th century was the computer, and it has gone through many changes and improvements since its creation. The last two decades have shown more advancement in computing than in almost any other invention. Computers have advanced almost every level of learning in our lives, and it looks like they will only keep making an impact through the decades. Computers in today’s society have become a focal point of everyday life and will remain so for the foreseeable future. During the evolution of computers, many people have helped with their creation and development, but some people’s contributions have been left out due to social status or a perceived lack of credibility in the field.

Computers have come a long way from their creation. The first computer was designed in 1822 by Charles Babbage. Later machines, built with series of vacuum tubes, could weigh 700 pounds or more, which is much larger than the computers we see today. For example, most laptops weigh in a range of two to eight pounds. A picture of one of the first computers can be seen below in figure 1. There has also been a large amount of movement in the data storage sector of computers. The very first hard drive was created in 1956; it had a capacity of 5 MB and weighed in at 550 pounds. Today hard drives are becoming smaller, and we see them weighing a couple of ounces to a couple of pounds. As files have become more complex, the need for more space in computers has increased drastically. Today we see games take up to 100 GB of storage. To give a reference for how big the difference between 5 MB and 100 GB is: 5 MB is 0.005 GB. The hard drives we have today reach sizes of 10 TB and larger; a TB is 1,000 GB. The evolution of the hard drive can be seen in figure 2. As the world of computers keeps progressing, the general aim is to make them smaller while delivering a generational step in improvement at the same time. With these large improvements, daily tasks for users like teachers, researchers, and doctors become quicker and easier to accomplish. Alongside these great advancements in hardware, we are witnessing advancements in software as well. In software development, we are seeing strides in staying connected to others by way of social media, messaging platforms, and other means of communication. With all of these advancements in hardware and software, the hope is that we do not become too reliant on computers; a large power outage or an EMP attack could cripple our way of life.

[Figure 1: "Gene Amdahl's first computer." Figure 2: "First hard drives." See image credits below.]
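
The storage comparison above is easier to see with the unit arithmetic written out. Here is a minimal sketch, using the decimal convention from the text (1 GB = 1,000 MB and 1 TB = 1,000 GB); the capacities are the examples quoted above, not measurements:

```python
MB_PER_GB = 1000          # decimal units, as used in the text
GB_PER_TB = 1000

first_hdd_mb = 5          # 1956 hard drive: 5 MB
modern_game_gb = 100      # a large modern game: ~100 GB
modern_hdd_tb = 10        # a large modern hard drive: 10 TB

print(first_hdd_mb / MB_PER_GB)                    # 0.005 GB -- the 5 MB figure expressed in GB
print(modern_game_gb * MB_PER_GB // first_hdd_mb)  # a 100 GB game holds 20,000x the 1956 drive
print(modern_hdd_tb * GB_PER_TB)                   # 10 TB = 10,000 GB
```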

The evolution of computers has been happening at a fast rate, and when this happens, people’s contributions are left out. The main demographic left out of the history of computers is women. Grace Hopper is one of the most influential people in the computing spectrum, but her work is rarely shown in the classroom. In the 1950s, Grace Hopper was a senior mathematician on the team building UNIVAC (the UNIVersal Automatic Computer). There she created the very first compiler (Cassel, 2016). This was a massive accomplishment for anyone in the field of computing because it introduced the idea that programming languages are not tied to a specific computer but can be used on any computer. This single feature was one of the main driving forces behind computing becoming as robust and powerful as it is today. Grace Hopper’s work needs to be talked about in classrooms, not just in engineering courses but in general classes as well. Students need to hear that a woman was a driving force behind the evolution of computing. Talking about this may encourage more women to join the computing field, because right now only 25% of jobs in the computing sector are held by women (Cassel, 2016). With a more diverse workforce in computing, we can see the creation of new ideas and features that were never thought of before.

During the evolution of computers, many people have been left out of the story of their creation, development, and algorithms. With the push toward gender equality in the coming years, the disparity between the credit given to women and to men should shrink to a negligible amount. As computers continue to evolve, the world of STS (Science, Technology, and Society) will need to evolve with them to adapt to the changes in technology. If not, some of the great creations in the computer sector will be neglected, most notably VR (Virtual Reality), with its high entry-level price and the motion sickness that can come along with it.

How has the advancement in tech improved your life?

REFERENCES                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           

A brief history of computers – unipi.it. (n.d.). Retrieved November 7, 2022, from http://digitaltools.labcd.unipi.it/wp-content/uploads/2021/05/A-brief-history-of-computers.pdf

Kleiman, K., & Saklayen, N. (2018, April 19). These 6 pioneering women helped create modern computers. ideas.ted.com. Retrieved September 26, 2021, from https://ideas.ted.com/how-i-discovered-six-pioneering-women-who-helped-create-modern-computers-and-why-we-should-never-forget-them/

Cassel, L. (2016, December 15). Op-Ed: 25 years after computing pioneer Grace Hopper's death, we still have work to do. USNEWS.com. advance-lexis-com.libproxy.clemson.edu/api/document?collection=news&id=urn:contentItem:5MDD-4VS1-JCKG-J4GB-00000-00&context=1516831

Thompson, C. (2019, June 1). The gendered history of human computers. Smithsonian.com. Retrieved September 26, 2021, from https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/

Zimmermann, K. A. (2017, September 7). History of computers: A brief timeline. LiveScience. Retrieved September 26, 2021, from https://www.livescience.com/20718-computer-history.html

Women in Computing and Women in Engineering honored for promoting girls in STEM. (2017, May 26). US Official News. https://advance-lexis-com.libproxy.clemson.edu/api/document?collection=news&id=urn:contentItem:5NMW-SXG1-DXCW-D04G-00000-00&context=1516831

IMAGES                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        

“Gene Amdahl’s first computer.” by Erik Pitti is licensed under CC BY 2.0

“First hard drives” by gabrielsaldana is licensed under CC BY 2.0

To the extent possible under law, Chandler Little has waived all copyright and related or neighboring rights to Science Technology and Society a Student Led Exploration , except where otherwise noted.


Computers: The History of Invention and Development Essay

The invention of the computer in 1948 is often regarded as the beginning of the digital revolution. It is hard to disagree that computers have indeed penetrated the lives of people and changed them once and for all. Computer technologies have affected every single sphere of human activity, from entertainment to work and education. They facilitate the work of any enterprise, they are of great assistance to scientists in laboratories, they make it possible to diagnose diseases much faster, they control the work of ATMs, and they help banks to function properly. The first computers occupied almost a whole room and were very slow in processing data and in performance in general. The modern world witnesses the development of computer technologies daily, with computers turning into tiny machines and working unbelievably smoothly. A computer is now trusted as a best friend and advisor. It is treated as a reliable machine able to process and store a large amount of data and help out in any situation. “The storage, retrieval, and use of information are more important than ever” since “(w)e are in the midst of a profound change, going from hardcopy storage to online storage of the collected knowledge of the human race” (Moursund, 2007), which is why computers are of great assistance to us. However, to become a successful person, it is not enough to simply have a computer at home. It is often the case that people use computers merely to play games without knowing about the wide range of activities they can engage a person in. One has to know more about computers and use all their capabilities for one’s own benefit. Knowing the capabilities of one’s computer can help in work and education, and it can save time and money. In this essay, you will find out why it is important to know your computer and how much time and money you can save by using all of its capabilities.

What should be mentioned above all is that knowing one’s computer perfectly gives an opportunity of using it for the most various purposes. It depends on what exactly a person needs a computer for, in other words, whether it is needed for studying, for work, or for entertainment. Using a computer for work or education purposes involves much more than is required for playing computer games. These days most of the students are permitted to submit only typed essays, research papers, and other works, which makes mastering the computer vital. “Information technologies have played a vital role in higher education for decades” (McArthur & Lewis, n.d.); they contributed and still continue to contribute to students’ gaining knowledge from outside sources by means of using the World Wide Web where information is easily accessible and available for everyone. To have access to this information one has to know how to use a computer and to develop certain skills for this. These skills should include, first of all, using a Web browser. “In 1995, Microsoft invented a competing Web browser called Microsoft Internet Explorer” (Walter, n.d.), but there exist other browsers the choice of which depends on the user. Moreover, knowing different search engines (for instance, Google, Yahoo, etc,) is required; the user should also be able to process, analyze, and group similar sources by means of extracting the most relevant information. At this, the user is supposed to know that not all Internet sources should be trusted, especially when the information is gathered for a research paper. Trusting the information presented in ad banners is unwise for their main purpose is attracting the users’ attention. They may contain false or obsolete data misleading the user. Utilizing the information obtained from the Internet for scholarly works, one should remember about plagiarism or responsibility for copying somebody else’s works. Students who use such information should cite it properly and refer to the works of other scholars rather than simply stealing their ideas. Plagiarism is punishable and may result in dropping out of school or college. This testifies to the fact that using a computer for studies demands the acquisition of certain computer programs and practice in working with them, which would give a perfect idea on how to search and process the information needed for completion of different assignments.

What’s more, knowing a computer for work is no less important. Mastering certain computer programs depends on the type of work. Any prestigious job demands a definite level of computer skills, from basic to advanced. The work of a company sometimes involves more than using standard computer programs; the software is usually designed specifically for the company depending on the business’s application. This means that acquisition of a special program may be needed, and a new worker will have to complete computer courses and gain knowledge of a particular program. Nevertheless, the knowledge of basic computer programs is crucial for getting the job one desires. Since the work of most companies is computerized, one will need to deal with a computer anyway, and the skills obtained while playing computer games will not suffice. A person seeking a job should be a confident user of basic computer programs, such as Microsoft Office Word, Microsoft Office Excel, Internet Explorer (or other browsers), etc. A confident user is also supposed to know what to do with the computer when some malfunctions arise. Of course, each company has system administrators who deal with computer defects, but minor problems are usually borne by the users themselves. Apart from knowing the computer, a person should be aware of the policy on using it in the office. For instance, some companies prohibit using office computers for personal purposes, especially when it comes to downloading software and installing it on the computer without notifying the system administrator. This may be connected either with the fact that incorrectly installed software may harm the system of the computer in general or, if the software has been downloaded from the Internet, that it may contain spyware which makes the information from your computer accessible to other users. This can hardly be beneficial for a company dealing with economic, political, governmental, or any other kind of issues. Therefore, knowing a computer is necessary for getting a prestigious job and ensuring the proper and safe performance of the company one is working for.

And finally, using all the capabilities of a computer can save time and money. Firstly, a personal computer has a number of tools which facilitate people’s lives. Special software, for instance Microsoft Money, makes it possible to plan the budget, to discover faults in the plan, and to correct it easily without having to rewrite it from the beginning; the program itself can manage financial information provided by the user and balance checkbooks in addition. Such computer tools as word processors enable the users to make corrections at any stage of the work; moreover, by means of them one may change the size of letters and the overall design of the work to give it a better look. Mapping programs can also be useful; by means of a computer one may install such a program (GPS) into the car, and the program then will take care of planning the route, avoiding traffic jams and choosing the shortest ways. Secondly, electronic mail allows keeping in touch with people not only in your country but abroad. It is cheaper and much faster than writing letters or communicating over the telephone, when the connection is often of low quality and the conversation is constantly interrupted. Most telephone companies are aimed at getting profits from people’s communication with their friends and relatives, whereas electronic mail is almost free; all that one needs to do is pay a monthly fee to the Internet Service Provider. Finally, computer users have an opportunity to do shopping without leaving the apartment; the choice of the products one may want to buy is practically unlimited, and the user can always find recommendations from those people who have already purchased the product. A personal computer can also help to save money due to its being multifunctional. Knowing much about the capabilities of the computer, one may start using it as a TV set, watching favorite programs online, and as a PlayStation, playing the same games on the personal computer. Not only can a user watch favorite TV shows by means of his/her computer, but he/she can download them at various torrent sites for free. Using a PC to send faxes through online fax services saves money, for one does not have to buy a fax machine or use an additional telephone line; it also saves the paper and ink which one would have to buy otherwise.

Taking into consideration everything mentioned above, it can be stated that knowing a computer is important for it can make people’s life much easier. Firstly, computers are helpful in getting an education since by means of them the students can find any possible information necessary for writing research papers and other kinds of written assignments. To do this, a student needs to know how to search the Internet and to process the information he/she can find there. Secondly, knowing a computer raises one’s chances of getting a good job because most of the companies look for employees with a sufficient level of computer skills. When working for a company one should also remember about its policy regarding the use of computer for personal purposes and be able to cope with minor problems arising in the course of work with the computer. Finally, a computer allows saving time and money. It saves the users’ time due to utilizing such tools as word processors, budget planning, and mapping programs which facilitate the users’ life. The computer can also save money serving as a TV, fax, and Playstation giving access to TV shows, online fax services, and allowing playing video games without buying special devices for this.

McArthur, D., & Lewis, W. M. (n.d.). Web.

Moursund, D. (2007). A College Student’s Guide to Computers in Education. Web.

Walter, R. (n.d.). The Secret Guide to Computers. Web.



The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term ‘computing machine’, used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.

  • Analog Computers
  • The Universal Turing Machine
  • Electromechanical versus Electronic Computation
  • Turing's Automatic Computing Engine
  • The Manchester Machine
  • ENIAC and EDVAC
  • Other Notable Early Computers
  • High-Speed Memory
  • Other Internet Resources
  • Related Entries

Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1990, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.
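
The Difference Engine produced its tables by the method of finite differences, which reduces the tabulation of a polynomial to repeated addition. The sketch below illustrates that principle in modern code; it is an illustration of the mathematics, not a model of Babbage's mechanism, and the example polynomial is arbitrary.

```python
def tabulate(poly, degree, start, count):
    """Tabulate a polynomial of the given degree by the method of differences:
    seed the difference columns from the first degree+1 values, then produce
    every further table entry using additions alone."""
    seed = [poly(start + i) for i in range(degree + 1)]
    cols = [seed]
    for _ in range(degree):
        prev = cols[-1]
        cols.append([b - a for a, b in zip(prev, prev[1:])])
    state = [col[-1] for col in cols]        # last known value in each difference column
    table = list(seed)
    while len(table) < count:
        for i in range(degree - 1, -1, -1):  # carry each difference up one column
            state[i] += state[i + 1]
        table.append(state[0])
    return table[:count]

# x^2 + 3x + 1 for x = 0..9, computed by additions alone after the seed values.
print(tabulate(lambda x: x * x + 3 * x + 1, degree=2, start=0, count=10))
```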

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language ADA is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).
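
As a loose illustration of this picture of boxes wired together with feedback (a digital simulation, not how the analog hardware physically operated; the equation and constants are arbitrary examples), one can wire an integrator box back onto its own scaled output to solve dy/dt = -k·y:

```python
def integrator(rate_of_change, y0, dt, steps):
    """Integrator box: accumulate its input signal over time (simple Euler steps).
    The input itself depends on the output -- the feedback loop described in the text."""
    y = y0
    trace = [y]
    for _ in range(steps):
        y += rate_of_change(y) * dt
        trace.append(y)
    return trace

k = 0.5                                            # "multiply by a constant" box
solution = integrator(lambda y: -k * y, y0=1.0, dt=0.01, steps=1000)
print(solution[-1])                                # close to exp(-k * 10) ≈ 0.0067
```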

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947], p. 393).) Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans in Evans [197?]).
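
A tiny simulator makes the picture of a scanner moving over a tape of symbols concrete. This is only a sketch in modern Python, not Turing's formulation; the instruction table below is an arbitrary example that adds one to a binary number.

```python
def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a one-tape Turing machine. `program` maps (state, symbol) to
    (symbol_to_write, head_move, next_state), with head_move -1 (left) or +1 (right)."""
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        write, move, state = program[(state, symbol)]
        cells[pos] = write
        pos += move
    return "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1)).strip(blank)

# Example program: move to the right end of a binary number, then add 1 with carries.
increment = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "add"),
    ("add", "0"):   ("1", -1, "halt"),
    ("add", "1"):   ("0", -1, "add"),
    ("add", "_"):   ("1", -1, "halt"),
}
print(run_turing_machine(increment, "1011"))   # -> 1100  (11 + 1 = 12)
```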

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).
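
Minimax, one of the heuristics Michie recalls, can be stated in a few lines. The sketch below is a generic illustration over an invented game tree, not a reconstruction of Turing's chess experiments; the tree and its leaf values are assumptions for the example.

```python
# Minimal minimax over an abstract game tree.  A position is either a numeric
# score (a leaf, from the maximising player's point of view) or a list of
# successor positions.

def minimax(position, maximising=True):
    if isinstance(position, (int, float)):          # leaf: return its value
        return position
    values = [minimax(child, not maximising) for child in position]
    return max(values) if maximising else min(values)

# A tiny invented tree: the maximiser chooses a branch, the minimiser replies.
tree = [
    [3, 5],        # branch A: opponent will answer with 3
    [2, 9],        # branch B: opponent will answer with 2
    [4, 6],        # branch C: opponent will answer with 4
]

print(minimax(tree))   # 4, the best outcome the maximiser can guarantee (branch C)
```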

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

With some exceptions — including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine - early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1944.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many as 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper — birthplace of the stored-program concept - but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35 year old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)
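
Newman's example order, 11, 13, 27, 4, meaning ‘add the contents of houses 11 and 13 and send the sum to house 27’, can be mimicked directly. The sketch below is only an illustration of that order format; the store contents and the handling of unknown function codes are assumptions, not a description of any actual machine.

```python
# Sketch of Newman's three-address order format: (source 1, source 2, destination,
# function).  Function code 4 means 'add', following his example; the store is a
# dictionary of numbered 'houses'.

def run(orders, houses):
    for a, b, dest, func in orders:
        if func == 4:                       # 4 signifies "add"
            houses[dest] = houses[a] + houses[b]
        else:
            raise ValueError(f"unknown function code {func}")
    return houses

houses = {11: 7, 13: 35, 27: 0}
run([(11, 13, 27, 4)], houses)              # Newman's example order: 11, 13, 27, 4
print(houses[27])                           # 42: [H11] and [H13] added, sent to H27
```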

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey 1947) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neumann [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machines' first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).
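
The effect of optimum coding can be illustrated with a toy timing model of a delay line: words circulate past the reading point one per clock tick, so an instruction stored n positions downstream only becomes available n ticks after the current time. The line length, execution time, and program length below are invented purely for the illustration, and the model ignores the practical problem of fitting instructions into a line of limited size.

```python
# Toy timing model of a delay line.  The line holds LINE words; the word stored at
# position p passes the reading point at every tick t with t % LINE == p.

LINE, EXEC, PROGRAM = 32, 2, 100    # assumed line length, execution ticks, instruction count

def total_time(next_position):
    """Run PROGRAM instructions; next_position(addr) gives where the next one is stored."""
    t, addr = 0, 0
    for _ in range(PROGRAM):
        t += EXEC                          # execute the current instruction
        target = next_position(addr)
        t += (target - t) % LINE           # wait until the target word re-emerges
        addr = target
    return t

# Consecutive storage: the next word has just gone past, so the machine waits a revolution.
sequential = total_time(lambda addr: (addr + 1) % LINE)
# Optimum coding (idealised): the next instruction is placed to emerge exactly when needed.
optimum = total_time(lambda addr: (addr + EXEC) % LINE)

print(sequential, optimum)   # 3300 versus 200 ticks: the optimum-coded program never waits
```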

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. A computer similar to the Whirlwind I was then built at MIT as a test vehicle for a ferrite core memory; the Memory Test Computer was completed in 1953. (This computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory (see Copeland and Proudfoot [1996]).)

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher, New Brunswick: Rutgers University Press.
  • Bennett, S., 1976, ‘F.C. Williams: His Contribution to the Development of Automatic Control’, National Archive for the History of Computing, University of Manchester, England. (A typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing, 15: 17–32.
  • Copeland, B.J. (ed.), 2004, The Essential Turing, Oxford: Oxford University Press.
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, Oxford: Oxford University Press.
  • Copeland, B.J., and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford: Oxford University Press.
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’, Synthese, 108: 361–377.
  • Evans, C., 197?, interview with M.H.A. Newman, in ‘The Pioneers of Computing: An Oral History of Computing’, London: Science Museum.
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications, New York: McGraw-Hill.
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office, October 7, 1919: 48.
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann, Princeton: Princeton University Press.
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’, Proceedings of the Royal Society of London (Series A), 195: 271–274.
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., and Michie, D. (eds), Machine Intelligence 7, Edinburgh: Edinburgh University Press.
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence, 47: 251–288.
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’, Proceedings of the Royal Society of London, 24: 262–265.
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society (Series 2), 42 (1936–37): 230–265; reprinted in The Essential Turing (Copeland [2004]).
  • Turing, A.M., 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004]).
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing–Wilkinson Lecture Series (1946–7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N., From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers, Bedford, Mass.: Digital Press (1981), pp. 181–246.
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’, The Radio and Electronic Engineer, 45 (1975): 237–331.
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’, Proceedings of the Royal Society of London (Series A), 136: 312–324.

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’, Annals of the History of Computing, 26: 38–45.
  • Metropolis, N., Howlett, J., and Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century, New York: Academic Press.
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers, Berlin: Springer-Verlag.
  • Williams, M.R., 1997, A History of Computing Technology, Los Alamitos: IEEE Computer Society Press.
  • The Turing Archive for the History of Computing
  • The Alan Turing Home Page
  • Australian Computer Museum Society
  • The Bletchley Park Home Page
  • Charles Babbage Institute
  • Computational Logic Group at St. Andrews
  • The Computer Conservation Society (UK)
  • CSIRAC (a.k.a. CSIR MARK I) Home Page
  • Frode Weierud's CryptoCellar
  • Logic and Computation Group at Penn
  • National Archive for the History of Computing
  • National Cryptologic Museum


Essay on History of Computers / Evolution of Computers (400–500 words)

While computers are now an important part of human life, there was a time when computers did not exist. Knowing the history of computers and their progress can help us understand how complex and innovative computer manufacturing is.

Unlike most devices, the computer is one of the few inventions that does not have a specific inventor. During the development of computers, many people have added their creations to the list of essentials for a computer to work. Some of the inventions have been different types of computers, and some of them were parts that allowed the computer to be further developed.

Perhaps the most important date in the history of computers is 1936. In that year the first “computer” was developed: created by Konrad Zuse, it was dubbed the Z1. It stands as the first because it was the first fully programmable system; there were devices before it, but none had the computing power that set it apart from other electronics.

Little commercial profit or opportunity was seen in computers until the early 1940s, when John Atanasoff and his student Clifford Berry completed the machine now known as the ABC, the Atanasoff-Berry Computer. Two years later, the Harvard Mark I computer was developed, further advancing the science of computing.

Over the next few years, inventors around the world began to study computers more closely and to look for ways to improve them. The following decade saw the introduction of the transistor, which would become an essential part of the inner workings of the computer, along with the ENIAC computer and many other types of systems. ENIAC is probably one of the most interesting of these: it required nearly 18,000 vacuum tubes to operate. It was a huge machine, and it started a revolution towards building smaller and faster computers.

The computer age was changed forever by the entry of International Business Machines, or IBM, into the computing industry in 1953. Throughout computer history this company has been a major player in the development of new systems and servers for public and personal use. Its arrival brought the first real signs of competition in computing history, leading to faster and better development of computers. IBM's first contribution was the IBM 701 EDPM computer.

Development of programming languages

A year later, the first successful high-level programming language was created. It was a language not written in ‘assembly’ or binary, which are considered very low-level languages. FORTRAN was written so that more people could begin to program computers easily.

In 1955, Bank of America teamed up with Stanford Research Institute and General Electric to build the first computers for use in banks. MICR, or Magnetic Ink Character Recognition, together with the actual computer, ERMA, was a breakthrough for the banking industry. It was not until 1959, however, that the system was put into use in actual banks.

In 1958, one of the most important breakthroughs in computer history occurred: the creation of the integrated circuit. This device, also known as a chip, is now one of the basic requirements for modern computer systems. On each motherboard and card within a computer system there are several chips that contain information about what the board and card do. Without these chips, the systems as we know them today could not function.

Gaming, Mice and the Internet

For many computer users, games are an important part of the computing experience. In 1962 the first computer game ‘Spacewar’ was created by Steve Russell and MIT.

One of the most basic components of a modern computer, the mouse, was created in 1964 by Douglas Engelbart. It derived its name from the “tail” emanating from the device.

One of the most important aspects of computing today was invented in 1969: ARPANET, the original Internet, which provided the foundation for the Internet as we know it. This development would lead to the growth of knowledge and business across the planet.

It wasn’t until 1970 that Intel entered the scene with the first dynamic RAM chip, resulting in an explosion of computer science innovation.

The first microprocessor followed on the heels of the RAM chip, also designed by Intel. Together with the integrated circuit developed in 1958, these two components form the core of modern computers.

A year later, the floppy disk was created, which derives its name from the flexibility of the storage unit. This was the first step in allowing most people to transfer bits of data between unconnected computers.

The first networking cards were made in 1973, allowing data transfer between connected computers. This is similar to the Internet, but it allows computers to connect to one another without using the Internet.

The emergence of home PCs

The next three years were very important for computers. This was when companies started developing systems for the average consumer. The Scelbi, Mark-8, Altair, IBM 5100, Apple I and II, TRS-80, and Commodore PET computers were the pioneers in this area. Although expensive, these machines started the trend of computers in ordinary homes.

One of the most prominent changes in computer software occurred in 1978 with the release of the VisiCalc spreadsheet program. All development costs were paid off within a two-week period, making it one of the most successful programs in computer history.


The IBM home computer helped revolutionize the consumer market in 1981, as it was affordable for homeowners and ordinary consumers. Also in 1981, the mega-giant Microsoft entered the scene with the MS-DOS operating system. This operating system changed computing forever, as it was easy enough for everyone to learn.

The Competition Begins: Apple vs. Microsoft

During 1983 computers saw another significant change. The Apple Lisa was one of the first computers with a graphical user interface, or GUI. Most modern programs have a GUI, which makes them easy to use and pleasant to the eye. This marked the beginning of the end for most text-only programs.

Beyond this point in computer history there have been many changes and advances, from the Apple-Microsoft wars to the development of microcomputers and the variety of computing breakthroughs that have become an accepted part of our daily lives. Without the very earliest stages of computer history, none of this would have been possible.


Essay 2 (200 words)

Early computers

The history of modern computers dates back to the early 1900s; in fact, computing devices of one kind or another have been around for more than 5,000 years.

In ancient times a “computer” (or “computor”) was a person who performed numerical calculations under the direction of a mathematician.

Some of the better-known early tools are the abacus and the Antikythera mechanism.

Around 1725, Basile Bouchon used punched paper in a loom to set the pattern to be reproduced on the cloth. This ensured that the pattern was always the same, with hardly any human error.

Later, in 1801, Joseph Jacquard (1752–1834) used the punched-card idea to automate more devices, with great success.


First computer?

Charles Babbage (1792–1871) was ahead of his time and, using the punched-card idea, developed the first computing devices intended for scientific purposes. He designed his Difference Engine, which he started in 1823 but never completed, and later began work on the Analytical Engine, which was designed in 1842.

Babbage is credited with inventing basic computing concepts, such as conditional branches, iterative loops, and index variables, on the strength of this work.

Ada Lovelace (1815–1852), a collaborator of Babbage, is regarded by many as the founder of scientific computing.

Babbage’s inventions were greatly improved upon, with George Scheutz working on a smaller version with his son Edward Scheutz, and by 1853 he had built a machine that could process 15-digit numbers and fourths. Could calculate the difference of the sequence.

Among the first notable commercial users (and successes) of computing machinery was the US Census Bureau, which used a punch-card device designed by Herman Hollerith to tabulate data for the 1890 census.

To compensate for the cyclical nature of the Census Bureau’s demand for its machines, Hollerith founded the Tabulating Machine Company (1896), one of three companies that merged to form IBM in 1911.

Use of digital electronics in computers

Later, Claude Shannon (1916–2001) first suggested the use of digital electronics in computers, and from 1937 J.V. Atanasoff worked on the first electronic computer, a machine that could solve 29 simultaneous equations with 29 unknowns. But this device was not programmable.

During the Second World War the development of computers was rapid, but many projects remained secret until much later because of wartime restrictions. A notable example is the British ‘Colossus’, built by Tommy Flowers in 1943 for the codebreakers at Bletchley Park, where Alan Turing and his colleagues worked.

In the early 1940s, the US Army commissioned John Mauchly to develop a device to calculate ballistics tables during World War II. As it turned out, the machine was only completed in 1945, but the Electronic Numerical Integrator and Computer, or ENIAC, proved to be a turning point in computer history.

The ENIAC proved to be a very efficient machine, but not a very easy one to operate. Any change sometimes required rewiring and reprogramming the device by hand. Engineers were aware of this obvious problem and developed a “stored program architecture.”

John von Neumann (a consultant to the ENIAC project), Mauchly, and their team developed EDVAC, a new machine that used this stored-program approach.

Eckert and Mauchly later developed what was arguably the first commercially successful computer, the UNIVAC.

Software technology was very primitive during this period. The first programs were written in machine code. By the 1950s, programmers were using a symbolic notation known as assembly language, and then translating the symbolic notation into machine code by hand. Programs later known as assemblers did this translation work automatically.
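
What an early assembler automated can be shown with a toy example: symbolic mnemonics and names are translated, line by line, into the numeric codes a machine would actually execute. The mnemonics, opcode numbers, and addresses below are invented for the illustration and do not correspond to any particular machine.

```python
# Toy assembler: translate symbolic instructions into numeric 'machine code',
# the chore that 1950s programmers first did by hand.

OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 9}    # invented encoding
SYMBOLS = {"price": 100, "tax": 101, "total": 102}         # invented addresses

def assemble(source):
    code = []
    for line in source.strip().splitlines():
        mnemonic, *operands = line.split()
        word = [OPCODES[mnemonic]] + [SYMBOLS[op] for op in operands]
        code.append(word)
    return code

program = """
LOAD price
ADD tax
STORE total
HALT
"""

print(assemble(program))
# [[1, 100], [2, 101], [3, 102], [9]]
```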

The transistor era and the end of the lone inventor

The late 1950s saw the end of valve-operated computers. Transistor-based computers were smaller, cheaper, faster, and much more reliable.

Corporations were now building new computers instead of inventors.

Some of the better known are:

  • TRADIC, at Bell Laboratories, in 1954
  • TX-0, at MIT's Lincoln Laboratory
  • The IBM 704 and its successors, the 709 and 7094; the latter introduced I/O processors for better throughput between I/O devices and main memory
  • The first supercomputers, the Livermore Atomic Research Computer (LARC) and the IBM 7030 (a.k.a. Stretch)
  • The Texas Instruments Advanced Scientific Computer (TI-ASC)

That, then, was the basis of the modern computer: transistor-based machines were faster, and with the stored-program architecture computers could be used for almost anything.

New higher-level programming languages soon followed: FORTRAN (1956), ALGOL (1958), and COBOL (1959), with Cambridge and the University of London collaborating on the development of CPL (Combined Programming Language, 1963). Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language, 1967).

The latest release of 1969, the CDC 7600, could perform 10 million floating-point operations per second (10 Mflops).

The network years

From 1985 onwards there was a race to put more and more transistors on a chip, each able to perform a simple operation. Yet apart from becoming faster and capable of more operations, computers did not change much in their basic design.

The concept of parallel processing has been more widely used since the 1990s.

In the field of computer networking, both wide area network (WAN) and local area network (LAN) technology developed rapidly.




History of Computers, Essay Example


Introduction

Computers have become an important part of social development from one human generation to the next. Unlike many other technologies, the invention and development of the computer cannot be credited to a single person. Rather, its steady evolution has been the work of many inventors, programmers and other enthusiasts who constantly sought to improve on what had already been discovered. It is therefore fair to say that the history of computers is defined by the collaborative effort of inventors and explorers of modern technology.

History shows that the evolution of computers has been driven by the human desire for innovations that make work more efficient. That desire has kept the development of computing under constant attention, and the resulting progress suggests that computers will continue to lead society toward further development in the near future. The discussion that follows highlights the main stages in the history of computers from the early 1930s to the present.

The First Generation of Computers (1936-1948)

Moving beyond the abacus and simple calculating machines, the 1930s opened the door to freely programmable computers: machines that could accept commands and carry out tasks defined by their users. Under the leadership of Konrad Zuse, the Z1 computer was produced. It featured a freely programmable design that let users customize the machine's functions to the tasks they needed to accomplish. Computers of this generation were typically programmed to carry out basic mathematical operations and formulas, which were most often used to control patterns of work in machinery; as this application spread, the number of machine operators needed during those years decreased accordingly.

The Second Generation of Computers (1951-1958)

While first-generation computers were built to respond to only a handful of commands at a time, the second generation opened the door to much more complex programming. In 1951, the first UNIVAC computer was released under the leadership of John Presper Eckert and John Mauchly. It could already handle large-scale operations, such as tabulating election results, and the complexity of the programs it could run gave society a new understanding of the role computers might play. This period also saw the release of the first commercial computers.

These commercial machines allowed a new generation of users to make practical use of what the technology could offer, driven by inventors and developers who wanted to extract the greatest benefit from computers. It was also during this phase that companies such as IBM rose to prominence and programming languages such as FORTRAN appeared. These pioneers built well-established computer systems that catered to the needs of a society just beginning to understand what computer technology could offer.

Gaming and Computer-User Interaction (1962-1981)

The 1960s saw the creation of highly interactive computers. Known as personal computers, these machines could be owned and used by individual users. The release of the first microprocessor in 1971 opened up the possibility of handheld computers, although such exploration still required a clear idea of what personal computers were to be used for. Effective data storage systems were also developed during this period.

These personal computers let individual users work more effectively and satisfied individual demands. Used for office tasks and personal work, these basic machines were easy for their first owners to use. Aimed at basic business operations, the first spreadsheet software, VisiCalc, was introduced for public use; it allowed users to set up calculations and formulas that ran automatically, making their tasks easier. WordStar, introduced by Seymour Rubinstein and Rob Barnaby, made office work easier still and served as the foundation for later, more advanced word processors.

In 1981, a new and revolutionary platform appeared in the form of the MS-DOS operating system. This "quick and dirty" operating system set a clearer direction for software development and enabled more complex computer functions, and from it grew more complex forms of the personal computer.

The Birth of New Age Computers (1983-1985)

From this new programming environment came computer setups that revolutionized the way computers were used at home. The first home computer with a GUI (graphical user interface) was brought to life under the leadership of Apple Inc., giving home owners far easier access to the technology and simplifying their computing tasks. From this point on, new computers were designed to fit home users' demands, and this shift in how computers were built and distributed improved their reputation as machines serving the basic needs of ordinary users.

Computer Revolution and Progress (1990-Present)

Since the early revolution in programming, the evolution of computers has been remarkably practical. What made such progress possible is the human desire to engage with something new and innovative at all times. New technology clearly shapes how society functions, and the computer revolution has connected people ever more closely to these machines, driving the development of new user interfaces that define what we now think of as modern computing.

Looking Into the Future

The future of computers continues to open new possibilities for advancement that will make it easier for people to complete their tasks. Whether in industry, the office, the home or any other area of society, computers are expected to shape how the human community operates as it embraces new opportunities. They are gradually becoming a defining element of human development and of the desire for social progress.

Over time, computing has changed how people carry out the tasks they are expected to accomplish, and the future of computers will continue to rest on that function. Because of this value, computer technology and the skills needed to advance it are treated as an investment, especially by those with a genuine interest in how computers work and how they develop in response to the community's demands.

Computers have been, and will remain, a foundation of social advancement. As the modern world keeps reshaping how people live, computers are expected to stay at center stage as primary drivers of growth. How thoroughly people become acquainted with this technology will determine how strongly computers shape the culture and way of life of modern society.

In relation to this discussion, the history of computer development shows how people have embraced social progress through the good application of computing. The practical advances computers offer today improve the way people carry out their basic tasks, and it is through them that the fast-paced culture of modern life is supported by new technological applications that keep redefining how people live as they embrace modernity.



Essay on Computer


Long and Short Computer Essay

Unlike today, the term "computer" was once used to refer to a person who did computation. The early prototypes that led to the modern computer are credited to many individuals throughout history. A series of breakthroughs, first the transistor and then the integrated circuit chip, produced transistor-based and later integrated-circuit computers, and digital computers largely replaced analogue computers.

In this essay, we will discuss the various components and types of computers and talk about their uses in various fields.

Long Computer Essay in English

A computer is an electronic tool that manipulates data or information. It can store, retrieve, and process information. We can type documents, send emails, play games, and browse the Web using a computer. It can also be used to edit spreadsheets, presentations, and even videos, or create them. 

Early computers were conceived only as devices for calculating. Simple manual devices such as the abacus have helped individuals do calculations since ancient times. Some mechanical devices were built early in the Industrial Revolution to automate long, tedious tasks, such as guiding patterns for looms. In the early 20th century, more sophisticated electrical machines performed specialized analogue calculations. 

Common Components of Computers

All those parts of a computer that are tangible physical objects are covered under the term hardware. The hardware includes circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and "mice" input devices.

 There are five main hardware components: 

Input Devices: 

These are devices that are used to enter data/information in the central processing unit. Example- keyboard, mouse, scanner, document reader, barcode reader, optical character reader, magnetic reader etc.

Output Devices: 

These are devices that provide the processed data/information into human-readable form. Example- monitor, printer, speaker, projector etc.

Control Unit: 

The control unit handles the various components of the computer; it reads and interprets (decodes) the instructions for the program, transforming them into control signals that activate other computer parts.

Arithmetic Logic Unit: 

It is capable of performing arithmetical and logical operations. The set of arithmetic operations supported by a specific ALU may be restricted to addition and subtraction, or may include multiplication, division, trigonometric functions such as sine and cosine, and square roots.

Central Processing Unit: 

The ALU, the control unit and the registers are together called the CPU. It is sometimes called the computer's brain, and its job is to carry out commands. We send instructions to the CPU whenever we press a key, click the mouse, or start an application.

Software refers to computer parts, such as programs, data, protocols, etc., that do not have a material form. In contrast to the physical hardware from which the system is built, the software is that portion of a computer system consisting of encoded information or computer instructions.

It is sometimes called "firmware" when the software is stored in hardware that can not be easily modified, such as with a BIOS ROM on an IBM PC compatible computer.

Computer hardware and software require each other, and neither of them can be realistically used on their own. There are four main components of a general-purpose computer: the arithmetic logic unit (ALU), the control unit, the memory, and the I/O (collectively called input and output) devices.
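As a rough illustration of how these four components cooperate, the following sketch in Python models a toy stored-program machine: memory holds both instructions and data, a control loop fetches and decodes each instruction, and the arithmetic stands in for the ALU. The instruction names and memory layout are invented for the example and do not describe any real processor.

# A toy stored-program machine: program and data share one memory.
memory = {
    0: ("LOAD", 100),   # program
    1: ("ADD", 101),
    2: ("PRINT",),
    3: ("HALT",),
    100: 2,             # data
    101: 40,
}

accumulator = 0   # a single working register
pc = 0            # program counter, advanced by the control unit

while True:
    instruction = memory[pc]          # fetch
    pc += 1
    op = instruction[0]               # decode
    if op == "LOAD":                  # execute
        accumulator = memory[instruction[1]]
    elif op == "ADD":
        accumulator = accumulator + memory[instruction[1]]   # the ALU's job
    elif op == "PRINT":
        print(accumulator)            # send the result to an output device
    elif op == "HALT":
        break
# Running this prints 42 and then stops.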

Uses of Computer

Computers are used in various fields, such as homes, businesses, government offices, research organizations, educational institutions, medicine, entertainment, etc. because of their features and powerful functions. They have taken sectors and companies to a whole new level.

Science- 

Computers are best suited for the collection, analysis, categorization, and storage of data in science, research and engineering. They also help scientists to exchange data both internally and internationally with each other.

Government-  

Computers in the government sector are used to perform various functions and improve their services. In most cases, data processing tasks, the maintenance of citizens' databases, and the promotion of a paperless environment are the primary purposes of using computers. In addition to this, computers play a key role in the country's defence system.

Health and Medicine- 

They are used to preserve information, records, live patient monitoring, X-rays, and more from patients. Computers assist in setting up laboratory tools, monitoring heart rate and blood pressure, etc. Besides, computers allow physicians to easily exchange patient data with other medical specialists.

Education- 

They help people get different educational materials (such as images, videos, e-books, etc.) in one place. Also, computers are best suited for online classes, online tutoring, online exams, and task and project creation. Also, they can be used to maintain and track student performance and other data.

Banking- 

Most countries use online banking systems so that customers can access their data directly. People can verify the balance of their account, transfer cash, and pay online bills, including credit cards. Besides, banks use computers to execute transactions and store client information, transaction records, etc.

Short Computer Essay in English

A computer is a programmable device that accepts raw data (input) and processes it with a group of instructions (a program) to produce a result (output). It renders output after performing mathematical and logical operations and can save the output for future use. The word "computer" derives from the Latin word "computare", which means to calculate.

Types of Computer

Computers are of different types based on different criteria. Based on their size, computers are of five types:

Micro Computers- 

It is a single-user computer that has less capacity for speed and storage than the other types. For a CPU, it uses a microprocessor. Laptops, desktop computers, personal digital assistants (PDAs), tablets, and smartphones are common examples of microcomputers. Microcomputers are generally designed and built for general use, such as browsing, information search, the internet, MS Office, social media, etc.

Mini Computers- 

Minicomputers are also referred to as "Midrange Computers." They are multi-user computers designed to simultaneously support multiple users. Therefore, they are generally used by small companies and firms. 

Mainframe Computers- 

It is also a multi-user computer that large companies and government organizations use to run their business operations as large amounts of data can be stored and processed. Banks, universities, and insurance companies, for example, use mainframe computers to store data from their customers, students, and policyholders.

Super Computer- 

Among all types of computers, supercomputers are the fastest and most costly computers. They have an enormous capacity for storage and computing speeds and can therefore perform millions of instructions per second.

Workstations-  

It is a single-user computer with a more powerful microprocessor and a higher-quality monitor than a typical personal computer.

Benefits of Computers:

It increases productivity.

It helps in connecting to the internet.

It helps in organizing data and information.

It allows storing large amounts of data.

Fun Facts About Computers

The first general-purpose electronic computer, the ENIAC, weighed around 27 tons and took up about 1,800 square feet.

It is estimated that thousands of new computer viruses are released every month.

The original name of Windows was Interface Manager.

It is clear that human life would not be as easy as it is today if computers were not part of it. Evidence of this is everywhere: computers are not only present in organizations but also sit in almost everyone's pocket. The computer has certainly made life easier, even though it has also made many people over-dependent on it.


FAQs on Essay on Computer

1. What are the disadvantages of computers?

While the computer has surely made life easier, it also has a lot of disadvantages. The disadvantages of the computers can be provided as follows:

People spend too much time sitting in front of computers and passively consuming content.

Staring at computer screens for long periods strains the eyes, and as a result many people end up needing spectacles to read what is in front of them.

Attention span is decreasing with an increase in the use of computers. 

With computers increasingly powered by AI, it is now easier for people to let the machine do their tasks rather than work on them themselves, which has made many people lazy.

2. What is the process of working on a computer?

A computer is an electronic machine and it needs information to be added in as raw data to function well. It has a flow that determines the accessing of data. The following steps take place before the results are obtained:

Information is taken in by the computer in the form of raw data. This process is also called the input.

Then the information that is not needed immediately is stored, while the information that is needed is passed on to the next step. The storing of data is called memory.

Then the required information is worked on, broken down and transformed; this step is called processing.

The last step is where the results are obtained. This process is called getting the output.
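A minimal sketch of this input-memory-processing-output flow, written in Python and assuming a made-up task in which the raw data is a list of numbers and the processing step simply computes their average:

# Input: raw data is taken in by the computer (typed by the user here).
raw = input("Enter numbers separated by spaces: ")

# Memory: the data is stored so that the next step can work on it.
stored = [float(x) for x in raw.split()]

# Processing: the stored data is transformed into the required result.
average = sum(stored) / len(stored) if stored else 0.0

# Output: the result is given back to the user.
print("Average of", len(stored), "numbers:", average)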

Essay on Computer and its Uses for School Students and Children

500+ Words Essay on Computer

In this essay on the computer, we are going to discuss some useful things about computers. The modern-day computer has become an important part of our daily life, and its usage has increased manifold during the last decade. Nowadays, computers are used in every office, whether private or government. Mankind has been using computers for many decades now, and they are used in many fields like agriculture, design, machinery making, defence and many more. Above all, they have revolutionized the whole world.


History of Computers

It is very difficult to find the exact origin of computers. But according to some experts, computers existed at the time of World War II, when they were used for keeping data. However, they were for government use only, not for the public. Above all, in the beginning the computer was a very large and heavy machine.

Working of a Computer 

The computer runs on a three-step cycle, namely input, process, and output, and it follows this cycle in every task it is asked to do. In simple words: the data we feed into the computer is the input, the work the CPU does is the process, and the result the computer gives is the output.

Components and Types of Computer

A simple computer basically consists of a CPU, monitor, mouse, and keyboard. There are also hundreds of other parts that can be attached to it, such as a printer, light pen, scanner, etc.

The computer is categorized into many different types like supercomputers, mainframes, personal computers (desktop), PDAs, laptop, etc. The mobile phone is also a type of computer because it fulfills all the criteria of being a computer.


Uses of Computer in Various Fields

As the usage of computers increased, it became a necessity for almost every field to use them for its operations. They have made working and sorting things easier. Below we mention some of the important fields that use computers in their daily operations.

Medical Field

Doctors use computers to diagnose diseases, run tests and search for cures for deadly diseases. Cures for many diseases have been found with the help of computers.

Research

Whether it is scientific research, space research or social research, computers help in all of them. Thanks to them, we are able to keep a check on the environment, space, and society. Space research has helped us explore the galaxies, while scientific research has helped us locate useful resources on the earth.

Defence

For any country, defence is most important for the safety and security of its people. Computers in this field help the country's security agencies detect threats that could be harmful in the future. Above all, the defence industry uses them to keep surveillance on the enemy.

Threats from a Computer

While computers have become a necessity, they have also become a threat. This is due to hackers who steal private data and leak it on the internet, where anyone can access it. Apart from that, there are other threats like viruses, spam, bugs and many other problems.


The computer is a very important machine that has become a useful part of our lives. It has two faces: on one side it is a boon, and on the other it is a bane, and how it is used depends entirely on the user. A day may come when human civilization will not be able to survive without computers, so much do we depend on them. For now, it remains a great discovery of mankind that has helped save millions of lives.

Frequently Asked Questions on Computer

Q.1  What is a computer?

A.1 A computer is an electronic device or machine that makes our work easier. Also, they help us in many ways.

Q.2 Mention various fields where computers are used?

A.2  Computers are majorly used in defense, medicine, and for research purposes.


Essay on Computer

500+ Words Essay on Computer

A computer is an electronic device that performs complex calculations. It is a wonderful product of modern technology. Nowadays, computers have become a significant part of our life. Whether it is in the sector of education or health, computers are used everywhere. Our progress is entirely dependent on computers powered by the latest technology. This ‘Essay on Computer’ also covers the history of computers as well as their uses in different sectors. By going through the ‘Computer’ Essay in English, students will get an idea of writing a good Essay on Computers. After practising this essay, they will be able to write essays on other topics related to computers, such as the ‘Uses of Computer’ Essay.

The invention of the computer has made our lives easier. The device is used for many purposes, such as securing information, messages, data processing, software programming, calculations, etc. A desktop computer has a CPU, UPS, monitor, keyboard, and mouse to work. A laptop is a modern form of computer in which all the components are inbuilt into a single device. Earlier, computers were not so fast and powerful. After thorough and meticulous research and work by various scientists, modern-day computers have come up.

History of Computers

The history of computer development is often used to reference the different generations of computing devices. Each generation of computers is characterised by a major technological development that fundamentally changed the way computers work. Most of the major developments from the 1940s to the present day have resulted in increasingly smaller, more powerful, faster, cheaper and more efficient computing devices.

The evolution of computer technology is often divided into five generations. These five generations of computers are as follows:

  • First generation: vacuum-tube computers
  • Second generation: transistor computers
  • Third generation: integrated-circuit computers
  • Fourth generation: microprocessor-based computers
  • Fifth generation: present and emerging computers based on artificial intelligence

Uses of Computers

Computers are used in various fields. Some of the applications are

1. Business

A computer can perform a high-speed calculation more efficiently and accurately, due to which it is used in all business organisations. In business, computers are used for:

  • Payroll calculations
  • Sales analysis
  • Maintenance of stocks
  • Managing employee databases

2. Education

Computers are very useful in the education system. Especially now, during the COVID time, online education has become the need of the hour. There are miscellaneous ways through which an institution can use computers to educate students.

3. Health Care

Computers have become an important part of hospitals, labs and dispensaries. They are used for the scanning and diagnosis of different diseases. Computerised machines do scans, which include ECG, EEG, ultrasound and CT Scan, etc. Moreover, they are used in hospitals to keep records of patients and medicines.

4. Defence

Computers are largely used in defence. The military employs computerised control systems in modern tanks, missiles, weapons, etc., and uses computers for communication, operations planning, smart weapons and more.

5. Government

Computers play an important role in government services. Some major fields are:

  • Computation of male/female ratio
  • Computerisation of PAN card
  • Income Tax Department
  • Weather forecasting
  • Computerisation of voters’ lists
  • Sales Tax Department

6. Communication

Communication is a way to convey an idea, a message, a picture, a speech or any form of text, audio or video clip. Computers are capable of doing so. Through computers, we can send an email, chat with each other, do video conferencing, etc.

7. Banking

Nowadays, banking is to a large extent dependent on computers. Banks provide online accounting facilities, which include checking current balances, making deposits and overdrafts, checking interest charges, shares, trustee records, etc. ATMs, which are fully automated, use computers, making it easier for customers to carry out banking transactions.

8. Marketing

In marketing, computers are mainly used for advertising and home shopping.

Similarly, there are various other applications of computers in other fields, such as insurance, engineering, design, etc.

Students can practise more essays on different topics to improve their writing skills.

Frequently asked Questions on Computer Essay

How has the invention of the computer been useful to students?

Easy and ready access to information (via the internet) has become possible with the invention of the computer.

How to start writing an essay on a computer?

Before writing an essay, first plan the topics, sub-topics and main points which are going to be included in the body of the essay. Then, structure the content accordingly and check for information and examples.

How to use the computer to browse for information on essays?

Various search engines are available, like Google, where plenty of information can be obtained regarding essays and essay structures.


