Computer Science: Algorithms, Theory, and Machines

Brought by: Coursera

Overview

This course introduces the broader discipline of computer science to people with a basic familiarity with Java programming. It covers the second half of our book Computer Science: An Interdisciplinary Approach (the first half is covered in our Coursera course Computer Science: Programming with a Purpose, to be released in the fall of 2018). Our intent is to demystify computation and to build awareness of the substantial intellectual underpinnings and rich history of the field of computer science.

First, we introduce classic algorithms along with scientific techniques for evaluating performance, in the context of modern applications. Next, we introduce classic theoretical models that allow us to address fundamental questions about computation, such as computability, universality, and intractability. We conclude with machine architecture (including machine-language programming and its relationship to coding in Java) and logic design (including a full CPU design built from the ground up).

The course emphasizes the relationships between applications programming, the theory of computation, real computers, and the field's history and evolution, including the nature of the contributions of Boole, Shannon, Turing, von Neumann, and others.

All the features of this course are available for free. No certificate will be offered upon completion.

Syllabus

  • INFORMATION ABOUT LECTURES 1–10
    • This lesson provides information about the course Computer Science: Programming with a Purpose, which is the precursor to Computer Science: Algorithms, Theory, and Machines.
  • SORTING AND SEARCHING
    • We introduce and study classic algorithms for two fundamental problems, in the context of realistic applications. Our message is that efficient algorithms (binary search and mergesort, in this case) are a key ingredient in addressing computational problems with scalable solutions that can handle huge instances, and that the scientific method is essential in evaluating the effectiveness of such solutions. (A short binary search sketch follows the syllabus.)
  • STACKS AND QUEUES
    • Our introduction to data structures is a careful look at the fundamental stack and queue abstractions, including performance specifications. Then we introduce the concept of linked structures and focus on their utility in developing simple, safe, clear, and efficient implementations of stacks and queues. (A linked-list stack sketch follows the syllabus.)
  • SYMBOL TABLES
    • The symbol table abstraction is one of the programmer's most important and useful tools, as we illustrate with several examples in this lecture. Extending the scientific approach of the previous two lectures, we introduce and study binary search trees, a classic data structure that supports efficient implementations of this abstraction. (A binary search tree sketch follows the syllabus.)
  • INTRODUCTION TO THE THEORY OF COMPUTING
    • The theory of computing helps us address fundamental questions about the nature of computation while at the same time helping us better understand the ways in which we interact with the computer. In this lecture, we introduce formal languages and abstract machines, focusing on simple models that are actually widely useful in practical applications. (A finite automaton sketch follows the syllabus.)
  • TURING MACHINES
    • In 1936, Alan Turing published a paper that is widely hailed as one of the most important scientific papers of the 20th century. This lecture is devoted to the two far-reaching central ideas of the paper: All computational devices have equivalent computational power, and there are limitations to that power. (A tiny Turing machine sketch follows the syllabus.)
  • INTRACTABILITY
    • As computer applications expanded, computer scientists and mathematicians realized that a refinement of Turing's ideas was needed. Which computational problems can we solve with the resource limitations that are inescapable in the real world? As described in this lecture, this fundamental question remains unanswered.
  • A COMPUTING MACHINE
    • Every programmer needs to understand the basic characteristics of the underlying computer processor being used. Fortunately, the fundamental design of computer processors has changed little since the 1960s. In this lecture, we provide insights into how your Java code actually gets its job done by introducing an imaginary computer that is similar to both the minicomputers of the 1960s and the microprocessor chips found in today's laptops and mobile devices. (A sketch of such a machine follows the syllabus.)
  • VON NEUMANN MACHINES
    • Continuing our description of processor design and low-level programming, we provide context stretching back to the 1950s and discuss future implications of the von Neumann machine, where programs and data are kept in the same memory. We examine in detail the idea that we design new computers by simulating them on old ones, something that Turing's theory guarantees will always be effective.
  • COMBINATIONAL CIRCUITS
    • Starting with a few simple abstractions (wires that can carry on/off values and switches that can control the values carried by wires), we address in this lecture the design of the circuits that implement computer processors. We consider gates that implement simple logical functions and components for higher-level functions, such as addition. The lecture culminates with a full circuit for an arithmetic/logic unit. (A gate-level adder sketch follows the syllabus.)
  • CENTRAL PROCESSING UNIT
    • In this lecture, we provide the last part of our answer to the question "How does a computer work?" by developing a complete circuit for a computer processor, where every switch and wire is visible. While vastly different in scale, this circuit, from a design point of view, has many of the same characteristics as the circuits found in your computer and your phone.
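
To make the syllabus concrete, the sketches below are short, illustrative Java fragments in the spirit of each lecture; the class names and test values are ours, not taken from the course materials. First, a minimal binary search: it halves the search interval at each step, so it uses at most about lg N compares on a sorted array of length N.

    public class BinarySearchDemo {
        // Return the index of key in the sorted array a[], or -1 if absent.
        static int search(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;  // midpoint, written to avoid overflow
                if      (key < a[mid]) hi = mid - 1;
                else if (key > a[mid]) lo = mid + 1;
                else                   return mid;
            }
            return -1;
        }

        public static void main(String[] args) {
            int[] sorted = { 2, 3, 5, 7, 11, 13 };
            System.out.println(search(sorted, 11));  // prints 4
            System.out.println(search(sorted, 4));   // prints -1
        }
    }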
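
A linked-list stack in the spirit of the stacks-and-queues lecture, a standard textbook formulation sketched with our own names: push and pop each touch only the first node, so both run in constant time.

    import java.util.NoSuchElementException;

    public class LinkedStack<Item> {
        private class Node {       // each node holds an item and a link to the node below
            Item item;
            Node next;
        }

        private Node first;        // top of the stack (null when empty)

        public boolean isEmpty() { return first == null; }

        public void push(Item item) {        // the new node becomes the new top
            Node oldFirst = first;
            first = new Node();
            first.item = item;
            first.next = oldFirst;
        }

        public Item pop() {                  // remove and return the top item
            if (isEmpty()) throw new NoSuchElementException("stack underflow");
            Item item = first.item;
            first = first.next;
            return item;
        }

        public static void main(String[] args) {
            LinkedStack<String> stack = new LinkedStack<>();
            stack.push("to"); stack.push("be"); stack.push("or");
            System.out.println(stack.pop());  // prints "or" -- last in, first out
        }
    }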
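
A bare-bones binary search tree symbol table, again a standard formulation with our own naming: get() and put() follow left or right links by comparing keys, so each takes time proportional to the depth of the tree (about lg N on average for random insertions).

    public class BST<Key extends Comparable<Key>, Value> {
        private Node root;

        private class Node {
            Key key; Value val;
            Node left, right;
            Node(Key key, Value val) { this.key = key; this.val = val; }
        }

        public Value get(Key key) {           // search: compare, then go left or right
            Node x = root;
            while (x != null) {
                int cmp = key.compareTo(x.key);
                if      (cmp < 0) x = x.left;
                else if (cmp > 0) x = x.right;
                else              return x.val;
            }
            return null;                      // key not in the table
        }

        public void put(Key key, Value val) { root = put(root, key, val); }

        private Node put(Node x, Key key, Value val) {   // recursive insert
            if (x == null) return new Node(key, val);
            int cmp = key.compareTo(x.key);
            if      (cmp < 0) x.left  = put(x.left,  key, val);
            else if (cmp > 0) x.right = put(x.right, key, val);
            else              x.val   = val;             // overwrite existing key
            return x;
        }

        public static void main(String[] args) {
            BST<String, Integer> st = new BST<>();
            st.put("ant", 1); st.put("bee", 2); st.put("cat", 3);
            System.out.println(st.get("bee"));  // prints 2
        }
    }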
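
A deterministic finite automaton, the simplest abstract machine in the theory lectures, reduces in code to a state variable updated once per input symbol. This sketch (our own example) recognizes binary strings containing an odd number of 1s.

    public class DFADemo {
        // Two states: 0 = even number of 1s seen so far, 1 = odd number.
        static boolean accepts(String input) {
            int state = 0;
            for (char c : input.toCharArray()) {
                if (c == '1') state = 1 - state;  // a 1 toggles the state; a 0 leaves it alone
            }
            return state == 1;                    // accept iff we end in the odd state
        }

        public static void main(String[] args) {
            System.out.println(accepts("0110"));  // false: two 1s
            System.out.println(accepts("0100"));  // true: one 1
        }
    }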
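
A Turing machine adds an unbounded tape and a movable head to the finite-state picture. As a tiny taste (our own example, not the course's machine), here is a one-state machine that adds one to a binary number by sweeping left from the low-order bit, turning 1s into 0s until a 0 or a blank absorbs the carry.

    import java.util.HashMap;
    import java.util.Map;

    public class TuringIncrement {
        public static void main(String[] args) {
            Map<Integer, Character> tape = new HashMap<>();   // unwritten cells are blanks
            String input = "1011";                            // 11 in decimal
            for (int i = 0; i < input.length(); i++) tape.put(i, input.charAt(i));

            int head = input.length() - 1;                    // start at the low-order bit
            while (true) {
                char c = tape.getOrDefault(head, '_');
                if (c == '1') { tape.put(head, '0'); head--; }  // propagate the carry left
                else          { tape.put(head, '1'); break; }   // absorb the carry and halt
            }

            StringBuilder out = new StringBuilder();
            for (int i = Math.min(0, head); i < input.length(); i++)
                out.append(tape.getOrDefault(i, '_'));
            System.out.println(out);                          // prints 1100 (12 in decimal)
        }
    }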
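
For the machine-architecture lectures, here is a hypothetical accumulator machine (not the course's specific design) that shows the fetch-decode-execute cycle and the von Neumann idea that program and data share one memory: each memory word encodes an instruction as opcode * 100 + address.

    public class TinyMachine {
        static final int HALT = 0, LOAD = 1, ADD = 2, STORE = 3;  // our made-up opcodes

        public static void main(String[] args) {
            int[] memory = new int[16];
            // program, stored at addresses 0..3
            memory[0] = LOAD  * 100 + 10;   // acc = memory[10]
            memory[1] = ADD   * 100 + 11;   // acc += memory[11]
            memory[2] = STORE * 100 + 12;   // memory[12] = acc
            memory[3] = HALT  * 100;        // stop
            // data, stored in the same memory at addresses 10..11
            memory[10] = 7;
            memory[11] = 35;

            int pc = 0, acc = 0;                        // program counter, accumulator
            while (true) {                              // fetch-decode-execute cycle
                int word = memory[pc++];                // fetch
                int op = word / 100, addr = word % 100; // decode
                if      (op == LOAD)  acc = memory[addr];
                else if (op == ADD)   acc += memory[addr];
                else if (op == STORE) memory[addr] = acc;
                else                  break;            // HALT
            }
            System.out.println(memory[12]);             // prints 42
        }
    }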
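
Finally, for the combinational-circuits lecture: boolean functions can stand in for gates, and composing them yields higher-level components such as an adder. This sketch (our construction) wires full adders into a 4-bit ripple-carry adder.

    public class AdderCircuit {
        // Gate-level primitives, modeled as boolean functions.
        static boolean and(boolean a, boolean b) { return a && b; }
        static boolean or(boolean a, boolean b)  { return a || b; }
        static boolean xor(boolean a, boolean b) { return a ^ b; }

        // A full adder: adds bits a and b plus carry-in, producing {sum, carry-out}.
        static boolean[] fullAdder(boolean a, boolean b, boolean carryIn) {
            boolean sum      = xor(xor(a, b), carryIn);
            boolean carryOut = or(and(a, b), and(carryIn, xor(a, b)));
            return new boolean[] { sum, carryOut };
        }

        public static void main(String[] args) {
            // Ripple-carry addition of two 4-bit numbers, least significant bit first.
            boolean[] x = { true, true, false, false };   // 0011 = 3
            boolean[] y = { true, false, true, false };   // 0101 = 5
            boolean carry = false;
            int result = 0;
            for (int i = 0; i < 4; i++) {
                boolean[] out = fullAdder(x[i], y[i], carry);
                if (out[0]) result |= (1 << i);           // collect the sum bit
                carry = out[1];                           // pass the carry along
            }
            System.out.println(result);                   // prints 8 (3 + 5)
        }
    }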

Taught by

Robert Sedgewick and Kevin Wayne


Details

  • Coursera
  • Free
  • English
  • Certificate Not Available
  • Available at any time
  • Intermediate
  • Arabic, French, Portuguese, Italian, German, Russian, English, Spanish, Thai, Indonesian, Kazakh, Hindi, Swedish, Korean, Greek, Chinese, Ukrainian, Japanese, Polish, Dutch, Turkish