#1 - Early Computing
S1:E1: Hello, world! Welcome to Crash Course Computer Science! So today, we’re going to take a look at computing’s origins, because even though our digital computers are relatively new, the need for computation is not.
#2 - Electronic Computing
S1:E2: So we ended last episode at the start of the 20th century with special-purpose computing devices such as Herman Hollerith’s tabulating machines. As the scale of human civilization continued to grow, so did the demand for more sophisticated and powerful devices. Soon these cabinet-sized electro-mechanical computers would grow into room-sized behemoths that were prone to errors. But it was these computers that would help usher in a new era of computation: electronic computing.
#3 - Boolean Logic & Logic Gates
S1:E3: Today, Carrie Anne is going to take a look at how those transistors we talked about last episode can be used to perform complex actions. With just two states, on and off, the flow of electricity can be used to perform a number of logical operations, which are guided by a branch of mathematics called Boolean algebra. We’re going to focus on three fundamental operations - NOT, AND, and OR - and show how they are realized in a series of really useful circuits. And it’s these simple electrical circuits that lay the groundwork for our much more complex machines.
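The three operations map directly onto code. Here's a minimal Python sketch, with bits as the integers 0 and 1; the XOR at the end is our own illustration of composing the three fundamentals, not a circuit from the episode:

```python
# The three fundamental Boolean operations on bits (0 or 1).
def NOT(a):
    return 1 - a

def AND(a, b):
    return a * b          # 1 only when both inputs are 1

def OR(a, b):
    return min(a + b, 1)  # 1 when at least one input is 1

# More complex gates fall out of composing these three, e.g. XOR:
# true when exactly one input is true.
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))
```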
#4 - Representing Numbers and Letters with Binary
S1:E4: Today, we’re going to take a look at how computers use a stream of 1s and 0s to represent all of our data - from our text messages and photos to music and webpages. We’re going to focus on how these binary values are used to represent numbers and letters, and discuss how our need to perform operations on larger and more complex values brought us from our 8-bit video games to beautiful Instagram photos, and from unreadable garbled text in our emails to a universal language encoding scheme.
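For a concrete taste of the idea, here is a small Python sketch showing the same eight bits read both as a number and as an ASCII letter:

```python
# The byte 01000001 has no inherent meaning; interpretation gives it one.
bits = "01000001"
as_number = int(bits, 2)    # read as an unsigned integer: 65
as_letter = chr(as_number)  # read as an ASCII character: 'A'

# Going the other way: encode a character back into 8 bits.
encoded = format(ord("A"), "08b")
```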
#5 - How Computers Calculate - the ALU
S1:E5: Today we're going to talk about a fundamental part of all modern computers - the thing that basically everything else uses - the Arithmetic and Logic Unit (or ALU). The ALU may not have the most exciting name, but it is the mathematical brain of a computer and is responsible for all the calculations your computer does! And it's actually not that complicated. So today we're going to use the binary and logic gates we learned about in previous episodes to build one from scratch, and then we'll use our newly minted ALU when we construct the heart of a computer, the CPU, in episode 7.
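A sketch of the gate-level approach in Python (gates simulated with bitwise operators; the 8-bit width is our choice for illustration): a full adder, chained into the ripple-carry adder at the core of an ALU's addition unit.

```python
# A full adder: sums two bits plus a carry-in, producing a sum bit
# and a carry-out, using only XOR, AND, and OR.
def full_adder(a, b, carry_in):
    partial = a ^ b                              # XOR
    sum_bit = partial ^ carry_in
    carry_out = (a & b) | (partial & carry_in)   # AND / OR
    return sum_bit, carry_out

# Chain 8 full adders into a ripple-carry adder. The final carry is
# discarded, so results past 255 wrap around (overflow).
def add8(x, y):
    result, carry = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result
```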
#6 - Registers and RAM
S1:E6: Today we’re going to create memory! Using the basic logic gates we discussed in episode 3, we can build a circuit that stores a single bit of information, and then, through some clever scaling (and of course many new levels of abstraction), we’ll show you how we can construct the modern random-access memory, or RAM, found in our computers today. RAM is the working memory of a computer. It holds the information the computer is actively working on, and as such is a crucial component for a computer to operate. Next week we’ll use this RAM, and the ALU we made last episode, to help us construct our CPU - the heart of a computer.
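The single-bit circuit the episode starts from is an AND-OR latch; a rough Python model of its behavior:

```python
# An AND-OR latch: output = (set OR output) AND NOT reset.
# Feeding the output back into the input is what lets it "remember" a bit.
class AndOrLatch:
    def __init__(self):
        self.output = 0

    def update(self, set_bit, reset_bit):
        self.output = (set_bit | self.output) & (1 - reset_bit)
        return self.output
```

Scaling this up is exactly the episode's point: a row of latches makes a register, and a grid of registers with address decoding makes RAM.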
#7 - The Central Processing Unit (CPU)
S1:E7: Today we’re going to build the ticking heart of every computer - the Central Processing Unit, or CPU. The CPU’s job is to execute the programs we know and love - you know, like GTA V, Slack... and PowerPoint. To make our CPU we’ll bring in the ALU and RAM we made in the previous two episodes, and then, with the help of Carrie Anne’s wonderful dictation, (slowly) step through some clock cycles. WARNING: this is probably the most complicated episode in this series. We watched it a few times over ourselves, but don't worry - at about 0.03 Hz, we think you can keep up.
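The fetch/decode/execute cycle the episode steps through can be caricatured in a few lines of Python. The opcode names and single register here are our invention for illustration, not the episode's exact instruction set:

```python
# A toy CPU: one register (A), a program counter, and four opcodes.
def run(program, memory):
    a, pc = 0, 0
    while True:
        opcode, operand = program[pc]   # FETCH the next instruction
        pc += 1
        if opcode == "LOAD_A":          # DECODE and EXECUTE it
            a = memory[operand]
        elif opcode == "ADD_A":
            a += memory[operand]
        elif opcode == "STORE_A":
            memory[operand] = a
        elif opcode == "HALT":
            return memory
```

For example, a four-instruction program can load the value at address 0, add the value at address 1, and store the sum at address 2.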
#8 - Instructions & Programs
S1:E8: Today we’re going to take our first baby steps from hardware into software! Using the CPU we built last episode, we’re going to run some instructions and walk you through how a program operates at the machine level. We'll show you how different programs can be used to perform different tasks, and how software can unlock new capabilities that aren't built into the hardware. This episode, like the last, is pretty complicated, but don’t worry - as we move forward into programming, the ideas of opcodes, addresses, and registers at this machine level will be abstracted away, like many of the concepts in this series.
#9 - Advanced CPU Designs
S1:E9: So now that we’ve built and programmed our very own CPU, we’re going to take a step back and look at how CPU speeds have rapidly increased, from just a few cycles per second to gigahertz! Some of that improvement, of course, has come from faster and more efficient transistors, but a number of hardware designs have also been implemented to boost performance. And you’ve probably heard or read about a lot of these - they’re the buzzwords attached to just about every new CPU release - terms like instruction pipelining, cache, FLOPS, superscalar, branch prediction, multi-core processors, and even supercomputers! These designs are pretty complicated, but the fundamental concepts behind them are not. So bear with us as we introduce a lot of new terminology, including what might just be the best computer science term of all time: the dirty bit. Let us explain.
#10 - Early Programming
S1:E10: Since Joseph Marie Jacquard’s textile loom in 1801, there has been a demonstrated need to give our machines instructions. In the last few episodes, our instructions were already in our computer’s memory, but we need to talk about how they got there - this is the heart of programming. Today, we’re going to look at the history of programming and the innovations that brought us from punch cards and punched paper tape to plugboards and consoles of switches. These technologies bring us to the mid-1970s and the start of home computing, but they had limitations, and what was really needed was an easier and more accessible way to write programs: programming languages, which we’ll get to next week.
#11 - The First Programming Languages
S1:E11: So we ended last episode with programming at the hardware level, with things like plugboards and huge panels of switches, but what was really needed was a more versatile way to program computers - software! For much of this series we’ve been talking about machine code, the 1s and 0s our computers read to perform operations, but giving our computers instructions in 1s and 0s is incredibly inefficient, and a “higher-level” language was needed. This led to the development of assembly code and assemblers, which allow us to use mnemonics and operands to more easily write programs, but assembly language is still tied to the underlying hardware. So by 1952, Navy officer Grace Hopper had helped create the first high-level programming language, A-0, and a compiler to translate that code for our machines. This would eventually lead to IBM’s FORTRAN and then a golden age of programming languages over the coming decades. Most importantly, these new languages used new abstractions to make programming easier and more powerful, giving more and more people the ability to create new and amazing things.
#12 - Programming Basics: Statements & Functions
S1:E12: Today, Carrie Anne is going to start our overview of the fundamental building blocks of programming languages. We’ll start by creating small programs for our very own video game to show how statements and functions work. We aren’t going to code in a specific language, but we’ll show you how conditionals like IF and ELSE statements, WHILE loops, and FOR loops control the flow of programs in nearly all languages, and then we’ll finish by packaging these instructions up into functions that can be called by our game to perform more and more complex actions.
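The episode stays language-agnostic, but the same building blocks look like this in Python. The game scenario and names (`heal_player` and friends) are hypothetical, just in the spirit of the episode:

```python
# A tiny "game" function: an IF statement caps health at its maximum.
def heal_player(health, amount):
    health += amount
    if health > 100:
        health = 100
    return health

# A WHILE loop calls the function repeatedly until its condition fails.
health = 40
potions_used = 0
while health < 100:
    health = heal_player(health, 25)
    potions_used += 1
```

Starting at 40 health, the loop runs three times (40 → 65 → 90 → 100) before the condition `health < 100` becomes false.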
#13 - Intro to Algorithms
S1:E13: Algorithms are the sets of steps necessary to complete a computation - they are at the heart of what our devices actually do. And this isn’t a new concept. Since the development of math itself, algorithms have been needed to help us complete tasks more efficiently. But today we’re going to take a look at a couple of modern computing problems, like sorting and graph search, and show how we’ve made them more efficient so you can more easily find cheap airfare or map directions to Winterfell... or, like, a restaurant or something.
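To make "a set of steps" concrete, here is a compact Python version of selection sort, one of the classic sorting algorithms of the kind the episode discusses:

```python
# Selection sort: repeatedly find the smallest remaining item
# and swap it into the next position.
def selection_sort(items):
    items = list(items)  # work on a copy; don't mutate the caller's list
    for i in range(len(items)):
        smallest = min(range(i, len(items)), key=lambda j: items[j])
        items[i], items[smallest] = items[smallest], items[i]
    return items
```

Its simplicity is the point: the efficiency gains the episode talks about come from smarter algorithms (like merge sort) doing the same job in fewer steps.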
#14 - Data Structures
S1:E14: Today we’re going to talk about how we organize the data we use on our devices. You might remember that last episode we walked through some sorting algorithms, but skipped over how the information actually got there in the first place! And it is this ability to store and access information in a structured and meaningful way that is crucial to programming. From strings, pointers, and nodes, to heaps, trees, and stacks, get ready for an ARRAY of new terminology and concepts.
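Several of those pieces combine naturally: nodes linked by pointers give you a stack. A minimal Python sketch (references stand in for hardware pointers):

```python
# A node holds a value plus a pointer (reference) to the next node.
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

# A stack is just a pointer to the top node: last in, first out.
class Stack:
    def __init__(self):
        self.top = None

    def push(self, value):
        self.top = Node(value, self.top)

    def pop(self):
        value = self.top.value
        self.top = self.top.next
        return value
```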
#15 - Alan Turing
S1:E15: Today we’re going to take a step back from programming and discuss the person who formulated many of the theoretical concepts that underlie modern computation - the father of computer science himself: Alan Turing. Now, normally we try to avoid “Great Man” history in Crash Course, because truthfully all milestones in humanity are much more complex than a single individual or a single lens can capture - but for Turing we are going to make an exception. From his theoretical Turing machine and work on the Bombe to break Nazi Enigma codes during World War II, to his contributions in the field of artificial intelligence (before it was even called that), Alan Turing helped inspire the first generation of computer scientists - despite a life tragically cut short.
#16 - Software Engineering
S1:E16: Today, we’re going to talk about how HUGE programs with millions of lines of code, like Microsoft Office, are built. Programs like these are way too complicated for a single person; instead, they require teams of programmers using the tools and best practices that form the discipline of software engineering. We'll talk about how large programs are typically broken up into functional units that are nested into objects, an approach known as Object-Oriented Programming, as well as how programmers write and debug their code efficiently, document and share their code with others, and how code repositories are used to allow programmers to make changes while mitigating risk.
#17 - Integrated Circuits & Moore’s Law
S1:E17: So you may have heard of Moore's Law, and while it isn't truly a law, it has pretty closely tracked a trend we've seen in the advancement of computing technologies. Moore's Law states that we'll see approximately a 2x increase in transistors in the same space every two years, and while this may not hold true for much longer, it has dictated the advancements we've seen since the introduction of transistors in the mid-1950s. So today we're going to talk about the improvements in hardware that made this possible - starting with the third generation of computing, integrated circuits (or ICs), and printed circuit boards (or PCBs). But as these technologies advanced, a newer manufacturing process would bring us to the nanoscale manufacturing we have today: photolithography.
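The 2x-every-two-years claim is easy to turn into arithmetic. A back-of-the-envelope Python helper (our simplification: it counts only whole two-year periods):

```python
# Project transistor count under Moore's Law: one doubling per two years.
def projected_transistors(start_count, years):
    doublings = years // 2
    return start_count * 2 ** doublings
```

So a hypothetical chip starting at 2,000 transistors would be projected to reach 2,048,000 after 20 years (ten doublings), which is why the trend is plotted on a logarithmic scale.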
#18 - Operating Systems
S1:E18: So as you may have noticed from last episode, computers kept getting faster and faster, and by the start of the 1950s they had gotten so fast that it often took longer to manually load programs via punch cards than to actually run them! The solution was the operating system (or OS), which is just a program with special privileges that allows it to run and manage other programs. So today, we’re going to trace the development of operating systems, from Multics and the Atlas Supervisor to Unix and MS-DOS, and take a look at how these systems heavily influenced popular OSes like Linux, Windows, macOS, and Android that we use today.
#19 - Memory & Storage
S1:E19: So we’ve talked about computer memory a couple of times in this series, but what we haven’t talked about is storage. Data written to storage, like your hard drive, is a little different, because it will still be there even if the power goes out - this is known as non-volatile memory. Today we’re going to trace the history of these storage technologies, from punch cards, delay line memory, core memory, magnetic tape, and magnetic drums, to floppy disks, hard disk drives, CDs, and solid state drives. Initially, volatile memory like RAM was much faster than these non-volatile storage technologies, but that distinction is becoming less and less true today.
#20 - Files & File Systems
S1:E20: Today we’re going to look at how our computers read and interpret computer files. We’ll talk about how some popular file formats, like TXT, WAVE, and bitmap, are encoded and decoded, giving us pretty pictures and lifelike recordings from just strings of 1s and 0s, and we’ll discuss how our computers are able to keep all this data organized and readily accessible to users. You’ll notice in this episode that we’re starting to talk more about computer users, not programmers - foreshadowing where the series will be going in a few episodes.
#21 - Compression
S1:E21: So last episode we talked about some basic file formats, but what we didn’t talk about is compression. Often files are way too large to be easily stored on hard drives or transferred over the Internet - the solution, unsurprisingly, is to make them smaller. Today, we’re going to talk about lossless compression, which gives you back the exact same thing when reassembled, as well as lossy compression, which uses the limitations of human perception to remove less important data. From listening to music and sharing photos, to talking on the phone and even streaming this video right now, the ways we use the Internet and our computing devices just wouldn’t be possible without the help of compression.
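Run-length encoding is a classic lossless scheme of the sort described above: replace each run of repeated symbols with the symbol and a count. A minimal Python sketch:

```python
from itertools import groupby

# Lossless: decoding the encoded form reproduces the input exactly.
def rle_encode(text):
    return [(ch, len(list(run))) for ch, run in groupby(text)]

def rle_decode(pairs):
    return "".join(ch * count for ch, count in pairs)
```

Note the trade-off that makes compression interesting: `"aaabcc"` compresses well, but data with no repeated runs can come out larger than it went in.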
#22 - Keyboards & Command Line Interfaces
S1:E22: Today, we are going to start our discussion of user experience. We've talked a lot in this series about how computers move data around internally, but not so much about our role in the process. So today, we're going to look at our earliest form of interaction: keyboards. We'll talk about how the keyboard got its QWERTY layout, and then we'll track its evolution in electronic typewriters and, eventually, terminals with screens. We are going to focus specifically on text interaction through command line interfaces, and next week we'll take a look at graphics.
#23 - Screens & 2D Graphics
S1:E23: Today we begin our discussion of computer graphics. We ended last episode with the proliferation of command line (or text) interfaces, which sometimes used screens but more typically printed output onto paper via electronic typewriters or teletypes. But by the early 1960s, a number of technologies were introduced to make screens much more useful, from cathode ray tubes and graphics cards to ASCII art and light pens. This era would mark a turning point in computing: computers were no longer just number-crunching machines, but potential assistants interactively augmenting human tasks. This was the dawn of graphical user interfaces, which we’ll cover more in a few episodes.
#24 - The Cold War and Consumerism
S1:E24: Today we’re going to step back from hardware and software and take a closer look at how the backdrop of the Cold War and the space race, along with the rise of consumerism and globalization, brought us from huge, expensive codebreaking machines in the 1940s to affordable handhelds and personal computers in the 1970s. This is an era that saw huge government-funded projects, like the race to the moon, and afterward a shift toward the individual consumer, the commoditization of components, and the rise of the Japanese electronics industry.
#25 - The Personal Computer Revolution
S1:E25: Today we're going to talk about the birth of personal computing. Up until the early 1970s, components were just too expensive, or too underpowered, to make a useful computer for an individual, but this began to change with the introduction of the Altair 8800 in 1975. In the years that followed, we'd see the founding of Microsoft and Apple and the creation of the 1977 Trinity: the Apple II, Tandy TRS-80, and Commodore PET 2001. These new consumer-oriented computers became a huge hit, but arguably the biggest success of the era came with the release of the IBM PC in 1981. IBM completely changed the industry, as its "IBM compatible" open architecture consolidated most of the industry - except for, notably, Apple, which chose a closed architecture, forming the basis of the Mac vs. PC debate that rages today. But in 1984, when Apple was losing market share fast, it looked for a way to offer a new user experience like none other - which we'll discuss next week.

The Best Episodes of Crash Course Computer Science Season 1
Every episode of Crash Course Computer Science Season 1 ranked from best to worst. Discover the Best Episodes of Crash Course Computer Science Season 1!

In this series, we trace the origins of our modern computers, take a closer look at the ideas that gave us our current hardware and...
Season 1 Ratings Summary
"Early Computing", the season premiere, aired on 2/22/2017. No ratings, director, or writer data is currently available for this season's episodes.