Hacker News
CS Unplugged: Computer Science Without a Computer (csunplugged.org)
403 points by avinassh on Sept 4, 2015 | hide | past | favorite | 87 comments


I have encountered a lot of people who earnestly want to learn to program. But it is difficult, as we all know - and a big part of that difficulty is learning to think in terms of process and small discrete steps. When that is compounded by annoying things like picky syntax, OS oddities, and compiler warnings, the result is a lot of long-term confusion for learners.

I've found that a deck of cards and some patience can really help people understand the tiny-step-by-tiny-step thinking that is fundamental to computer science and programming. Exercises like "sort this row of cards, following only these rules" and "yes, it's frustrating to do that over and over - let's group those steps together in a new rule called $X" instill that thinking better than any amount of explaining a programming language ever did. Later, introducing the computer into the mix actually seems easier once the student is accustomed to thinking like a computer.


I have to second the deck of cards approach. One particularly effective exercise is to hand someone a shuffled deck of numeric cards and ask them to write down a process for sorting them, using only the concept of bigger or smaller. Given about 20 minutes, most people will come up with selection sort on their own.
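The process most people rediscover is selection sort: repeatedly scan the unsorted cards for the smallest one and move it to the front. A minimal sketch of that rule set (an illustration, not the exercise's own wording), using nothing but "bigger or smaller" comparisons:

```python
def selection_sort(cards):
    """Sort a row of cards using only 'bigger or smaller' comparisons."""
    cards = list(cards)
    for i in range(len(cards)):
        # Scan the unsorted remainder for the smallest card...
        smallest = i
        for j in range(i + 1, len(cards)):
            if cards[j] < cards[smallest]:
                smallest = j
        # ...then move it to the front of the unsorted region.
        cards[i], cards[smallest] = cards[smallest], cards[i]
    return cards

print(selection_sort([7, 2, 9, 4, 1]))  # [1, 2, 4, 7, 9]
```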

It's this sort of introspection that's key to understanding computers. We abstract large chunks of computational thinking naturally. It's essential to be able to ask yourself, "How am I really doing this?" - and to come up with an answer.

edited slightly for grammar


I'm volunteering to run a code club for kids. I saw a great video of how to teach a sorting algorithm in the playground by drawing with chalk on the floor. The kids then walk through the pattern and end up sorted at the end.

It was something like this: http://csunplugged.org/sorting-networks/#Videos
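Those chalk diagrams are sorting networks: fixed sequences of pairwise compare-and-swap stations that the kids walk through, with the smaller value exiting one side of each station and the larger the other. A rough sketch of the idea (using an odd-even transposition network, which is easy to verify, rather than whatever wiring the video uses):

```python
def compare_exchange(vals, i, j):
    # One "station": the smaller value goes to position i, the larger to j.
    if vals[i] > vals[j]:
        vals[i], vals[j] = vals[j], vals[i]

def odd_even_sort(vals):
    """Odd-even transposition network: n rounds of fixed comparisons
    provably sort any n inputs, no matter what order they arrive in."""
    vals = list(vals)
    n = len(vals)
    for round_no in range(n):
        start = round_no % 2  # even rounds compare (0,1),(2,3)...; odd rounds (1,2),(3,4)...
        for i in range(start, n - 1, 2):
            compare_exchange(vals, i, i + 1)
    return vals

print(odd_even_sort([5, 1, 4, 2, 6, 3]))  # [1, 2, 3, 4, 5, 6]
```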


Robot turtles is a cool take on this approach for kids.

http://www.robotturtles.com


I'm very happy to see this. I've taught computer science at the college level. Whether you realize it or not, learning CS has very little to do with typing symbols into a computer. While it's true the computer sometimes surprises us, I've told my students that if you can't run the program in your head, then you simply don't understand what's going on. You haven't learned anything.

More often than not, the computer facilitates a guess and check programming mentality that isn't just endemic to students, but to our profession in general. I'm certainly not immune to the temptation to fix problems by repeated runs with small tweaks: "Does it work? No... no... no... yes! Move on!" That's not computer science. It's data entry.

It's not that computers have no place in the curriculum. They can be immensely helpful in understanding complex algorithms, much the same way performing an experiment can lead to a more thorough understanding of physics or chemistry. However, no one would argue chemistry is about moving chemicals from one test tube to another.


> if you can't run the program in your head, then you simply don't understand what's going on

Agreed, and sometimes it's surprisingly tricky to describe what's happening not in terms of the coding abstractions we all know and love, but just in plain English.

I remember I did a contract years ago for a PhD student which was basically doing some statistical analysis of a massive dataset. When I was doing progress reports she'd often drill down and ask, out of genuine curiosity (she was an academic, go figure :-)), just how the computer was doing the things I was describing.

I couldn't use abstractions like 'well I just loop over this set and accumulate x and...' because that's gibberish to her - so I had to break down what was actually going on in plain English (obviously with a degree of abstraction from the hardware etc). I did so very awkwardly because I was used to just 'thinking in code', but I remember it was very helpful to be forced to do this, because it really tested whether I actually understood what my algorithms were doing.

But obviously there's a limit to the usefulness of this line of thinking. Abstractions are always present - taken ad absurdum, we'd never talk about electrons flying around. It ceases (except in rare cases) to be useful to describe what is actually happening in more detail than 'this black box does X, and I use it with another black box that does Y to achieve Z' right about at the level where you're writing 'real' applications with frameworks and whatnot.


I wonder to what extent you were just finding actual abstractions that the two of you could share. "For each number/thing in this list, you do a calculation" is an abstraction from an old school "for (i = 0; i < n; i++)" loop. A lot of modern conveniences in programming are abstractions that ordinary language has had forever.
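To make that concrete with a throwaway example, here is the same sum of squares written at both levels - the "counter and subscript" description and the "for each thing, do a calculation" abstraction that ordinary language already has:

```python
nums = [3, 1, 4, 1, 5]

# Index-level description: maintain a counter, step it, subscript the list.
total = 0
i = 0
while i < len(nums):
    total += nums[i] * nums[i]
    i += 1

# "For each number in this list, do a calculation" - the everyday-language level.
total2 = sum(n * n for n in nums)

print(total, total2)  # 52 52
```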

A lot of people in programming think of abstraction in terms of AbstractFactoryFactories or LifecycleConfigurators and then start to complain about architecture astronauts and leaky abstractions. But there are more basic abstractions that we use every day, which are perfectly precise, don't leak[1] and are near-indispensable.

[1] Except perhaps in terms of performance.


> I've told my students that if you can't run the program in your head, then you simply don't understand what's going on. You haven't learned anything.

Absolutely. A coworker has a similar version of that line, which I've committed to memory:

"If you don't understand precisely what you need to do, what makes you think you can tell a computer how to do it?"

I also briefly taught programming, and that's a line I've used more often than I'd like. Another thing I've noticed is that IDEs tend to encourage beginners to perform much random fiddling of code that doesn't work, often creating a mess in the process.


I would be careful here - when I studied organic and physical chemistry, I was very surprised to discover how much of the discipline was empirical. X + Y + heat = Z. Why? Mmmm, ask a physicist.


I completely agree with you here. It's also quite good to practice optimizing or iterating on your implementation of something mentally before committing to typing it out, and certainly before repeatedly running your changes until it just works.


You really think improvisational programming is equivalent to data entry? Have you ever done data entry?


I don't think improvisational programming (as I understand it) is equivalent to data entry. That also was not what I was describing.

You seem to be a fairly adept programmer. The concept of improvisational programming makes sense for you - you draw on years of experience and training (formal, or not) to feel out a concept when you have a goal in mind but no clear way to get there. You probably perform many small experiments to learn more about the intricacies of whatever you are working on. You can do all this because you've progressed past the beginning stages.

Most beginners simply aren't capable of this. There are too many blind alleys. Too many traps. It's not that I discourage self-directed learning, experimenting or researching - all are essential to learning computer science. Rather, I've seen far too many students simply resort to guess-and-check methodology when faced with a problem. This isn't the kind of guided exploration that comes with experience but: "Adding 1 to that index didn't work - I wonder if 2 will? 3? 4? 10? Oh! 12 works! Great - I'm done!"

Incidentally, I have done data entry. My first computer was an Apple II. I used to type in programs from magazines without really understanding what they were doing. Even worse, I typed in raw hexadecimal program data. I know what it's like to type something in you don't understand, find out it's wrong, tweak it, and try again ad infinitum (or so it seemed). "Does A9 work? What about AA? AC? ...AF works! Great - I'm done!"


Learning CS without a computer brings you incredibly good foundations when you do program with a computer. Algorithms, data structures, and other theory become much easier to implement. I used to think it was stupid that we had to implement algorithms on paper tests, but a lot of that makes you a better programmer. I still don't think paging memory on paper has any good use for an OS class--just busy work :).


I think there's something to doing it in your head/on paper. Reminds me of the story told of Don Knuth, clipped here from Quora...

Quote from Alan Kay about Knuth:

When I was at Stanford with the AI project [in the late 1960s] one of the things we used to do every Thanksgiving is have a computer programming contest with people on research projects in the Bay area. The prize I think was a turkey.

[John] McCarthy used to make up the problems. The one year that Knuth entered this, he won both the fastest time getting the program running and he also won the fastest execution of the algorithm. He did it on the worst system with remote batch called the Wilbur system. And he basically beat the shit out of everyone.

And they asked him, "How could you possibly do this?" And he answered, "When I learned to program, you were lucky if you got five minutes with the machine a day. If you wanted to get the program going, it just had to be written right. So people just learned to program like it was carving stone. You sort of have to sidle up to it. That's how I learned to program." - [1]

[1] - http://www.quora.com/How-would-Donald-Knuth-fare-as-a-compet...


From today's perspective, it feels odd that those things existed in many people's life times; to be only allowed 'five minutes' access to a computer. Nowadays if you want to program, you could find a used machine to write code on for five hours of minimum wage, or a new one for twenty.


Or rent a VPS for 45 minutes at minimum wage.


But what about the computer and internet connection to access the VPS?


Yes, my point is that getting access to a Linux server on the Internet which you aren't afraid to accidentally break is easy.


Though it sounds like the right way to go, my personal experience points to the exact opposite.

I'm unfortunately cursed in that I have a lot of trouble starting a problem until I really, really understand all its details and the details of my solution. I basically write down exactly what I want and am going to do on paper.

I find that my coworkers that start with a vague idea of what they want (without a complete understanding of the system) but work in quick hack->fix iterations produce results significantly faster.

I don't have a huge sample size, but that's simply what I've observed. The person that finishes first is the person that starts typing first. The one that mulls over everything in their head might have a more elegant solution and a clearer git repo... but they always finish last.


Knuth wrote the entirety of the first version of TeX on yellow legal note pads, and then typed it all in, and then started debugging. Ditto for MetaFont. Both are the equivalent of about 10kloc (after the Tangle preprocessor removes the voluminous comments).


I would love a reference about the yellow legal note pad thing.

In reality, writing a program in longhand is something everybody could do -- but because nowadays programming is mostly plumbing, you have to empirically test everything.


I saw it with my own eyes when I was in his office to discuss some typesetter-interfacing issues, but I suppose that's not enough of a reference. Perhaps the story is repeated in an old issue of TuGboat, the journal of the TeX Users Group, but I couldn't find it. Here's a more extensive description from a decade ago though: https://groups.google.com/d/msg/comp.text.tex/9quGg7j6U0k/tl...


I don't know exactly when TeX was written, but back in the ed days before vi and emacs, a yellow legal pad was a substantially better full-screen editor than the line editor. Computers were for executing programs, not editing text.


Knuth developed the first version of TeX in 1978, on the Sail mainframe, a 36-bit DEC PDP-10 with a unique hardware graphics system called Data Disc: a single-platter disk with 32 fixed read/write heads, where each of the 32 tracks contained exactly the 512 X 480 bits that constituted one frame buffer's worth of pixels. The amazing hack was that the rotational speed of the disk was just right so that during each rotation, the raw bits coming off of each head produced a video signal with just the right frequency for piping out through a coax to a completely dumb black-and-white video screen! (Actually, there was a cross-switch in there, so that there could be 64 of these special screen/keyboard units, of which 32 could be in use at a time; in fact, you could walk up to any of them and "grab" the channel that you had been using from a different office, or, for that matter, peek at anybody else's channel.)

Anyway, the custom OS ("Waits") made good use of the Data Disc graphics system: it had a built-in interactive line editor, so that when you were in the shell, you could edit your command line (control-d deletes a character, etc., etc.) and see the result in realtime. (This was years and years before Unix got similar features in tcsh and bash and the readline library). All programs inherited this functionality automatically, so Waits's full-screen editor ("E") was simply built on top of it. (Again, years and years before emacs and vi, and all on a system with a per-process address space of only 256K words (about 1 MB), split evenly between data and code.)

So, to finally answer your question: while Knuth did spend his early years programming on punched-card batch systems (where you pretty much had to write out code long-hand before keypunching it), by the time he had started working on TeX he had been exclusively using a full-screen editor on a graphical display for many years.


I learned to program like that. Our school did not have any computers, but there was a CS class in the curriculum. Luckily our teacher took her job seriously and taught the basics of programming: variables, assignment, loops, algorithms in general. Later, when I did get access to a computer, it was only a matter of learning the syntax of the particular language. I still had no access to a computer at home, so I did my programming on paper and typed in the programs later. I think that's good, because it makes you think about what you are doing instead of just mindlessly permuting code till it works.


The most brilliant programmer I know said his parents had given him a bunch of money to buy a computer when he went to study abroad. But he used the money for food and did all of his assignments on paper - only ever typing them in just before they were due, in the school labs.

Doing them on paper, he'd get to a point where he had a good understanding of his solution and was extremely confident that they'd work on the day!


Programming on paper with one attempt at a run and doing crossword puzzles in pen strike me as analogous.

I've never programmed on paper to any real extent but doing crosswords in pen certainly made me better at crosswords.


Doing crosswords in pen has taught me that ball points can write very faintly if you use light pressure, so that a later overstrike visually dominates.


I found that doing easy and medium Sudoku in pen is remarkably enjoyable, far more so than it ever was when I would make temporary marks. It's harder, and I'm not very good at Sudoku in general, but it really reinforces the "you have to prove it ..." mindset. Strangely, even though I am slower, I seem to get stuck less often.


As someone who studied CS in school, I appreciate this approach. But I don't know if I would if I were, say, learning to code as an adult. The reality of the matter is that computers themselves are a massive aid in learning how to program...and I mean far beyond the ability to quickly Google/StackOverflow questions.

Being adept at an interactive interpreter, for example, opens up the opportunity for fulfilling (or at least, less frustrating) interactive debugging...to me, being able to debug is at the core of understanding programming. And while that's not pure computer science, per se, it's a great way to not just understand and replicate CS concepts, but to fully test and explore them via immediate feedback. Sometimes I've found that I can only understand an algorithm by implementing it in code, and then tweaking/breaking it to test my assumptions...a computer makes it so that such exploration is not impossibly tedious.


This isn't about learning how to program though. This is dedicated to imparting computational thinking as a skill. It's not going to teach you how to program, but it will make you a better programmer.


Indeed, learning to program and learning to think like a computer are very different activities and skills.


I remember walking into Comp Sci 101 on the first day; quite a few people had laptops, and everyone was surprised to see it wasn't a computer lab classroom. Right off the bat our professor addressed this: "Yes, this is Computer Science, but it's really science about computing. You won't ever see a computer in this classroom. You'll do programming assignments at home, but in here we take notes, talk, and draw diagrams." Felt weird at the time but makes perfect sense to me now.


I have great respect for Tim Bell, someone who has put everything into this project to make CS more accessible and get it taught at an earlier age.

I can still remember back in 2008 when he gave a special lecture to COSC122-S208 with an early prototype of this program. It was obvious that I wasn't the only one left with a feeling of 'why didn't they just tell us that to start with?' after 10 weeks of battling beginner skills at Java while trying to learn data structure concepts and algorithms at the same time.


This would be really good for prisoners too, who don't usually have access to a computer but are often interested in learning about computers.


or for some people who don't know how a computer works ;) http://www.youtube.com/watch?v=5Qj8p-PEwbI


Computer Science without a Computer… you mean math? ;)

Joking aside, this can be a very good exercise. I did interviews on the whiteboard, and it did teach me that I have come to rely a little too much on compiling and running to understand the logic I've written.


Your comment made me stop and think about all the times I've seen someone complain about 'coding' on a whiteboard during interviews. Did those people learn 'computer science' entirely in front of a computer, and so sort of 'trial-and-errored' their way to the correct answer? Not trying to claim one way is better than the other, but rather trying to understand why the argument over coding in interviews is so polarized.


I think the main problem with "coding on a whiteboard" during interviews is not that you have to reason about an algorithm without the help of a computer, but rather that you have very limited time to do so. You stand there, looking at an empty board, trying to come up with something sensible (or at least, a decent start) in a few seconds. In the meantime people are looking at you, waiting for your answer, which adds just a tiny bit of psychological pressure. :) This is a very different situation from sitting down at your desk and thinking an algorithm over, at your leisure.


I can only speak for myself, but I just want a keyboard and a screen. I don't need a compiler--it's just got to do with the transcription method, not because I need the write/execute/debug loop.

I'd be fine doing a coding interview with just Notepad.exe or vi or whatever. It takes me ~2 seconds to type a line of code, but more like 10 or 15 seconds to handwrite it [0], and my brain's just not used to that kind of latency. It trips me up.

Plus, on a whiteboard, the mechanics of "oops I need to insert a line... I guess I'll just write it down here and draw a big arrow... ok now I need to rename this variable, but now the name is really long and I can't fit it... wait, what was I doing?".

It just mucks up my process. I don't precisely conceive an entire subroutine before I put fingers to keyboard--my code evolves as I'm writing it. I edit, revise, rethink, and refactor constantly, long before anything's even compiled. Keys and screen facilitate that process a thousand times better than pencil and paper. Handwriting just isn't the best medium for code [1].

[0] And thank god I have a CS education, because if nothing else I at least learned how to write legible braces, brackets, ampersands, and at-signs by hand...

[1] OP is obviously a wonderful idea but that's because it's teaching aspects of CS that aren't code.


Yeah, I opened a can of worms with my comment, didn't I? For the record, I'm one of the people who has complained here on HN about whiteboard interviews. Just because they taught me that I rely a bit too much on the compile/run cycle doesn't mean that I think coding exercises at the whiteboard are the right way to do interviews!

I think your use of the term "latency" is a very good way to describe some of the problems around whiteboard coding.


I do not "trial and error" my way to correct answers, but I complain bitterly about whiteboard interviews. It's the wrong medium. I wouldn't mind at all being asked to use a pack of cards as a prop to talk through a sort algorithm, but asking me to write one out on a whiteboard is just hamstringing me.


That is a valid objection. It's best to use the right tool to communicate about the problem and solution.


For me, part of it is literally the whiteboard. I do almost nothing in real life on a whiteboard. I don't like writing on them. I'd much prefer a pencil and paper on a desk.

I'd even somewhat prefer a chalkboard.


I would say yes. Some programmers have trouble reasoning abstractly, or think it somehow beneath them because it does not produce something real at the end.

There are also "programmers" whose entire method of work seems to be to copy-paste blocks of code from Stack Overflow, changing only the minimum amount to make something load, and then move on to the next item in their hit list.

I always liked white boarding problems.


I've spent a good chunk of my waking life studying programming the past several years, in an effort to get good at this stuff and really understand what I'm doing. I am incredibly jealous that people get to be professional "programmers" without really caring.


Reminds me of competing in the ACM Intercollegiate Programming Competitions. With 3 students and one computer, you had to solve the problems offline and only spend keyboard time for entering and running your program. It really teaches a skill-set that's hard to appreciate if you always have a computer and the compiler in front of you while working out solutions.


Yes. I thought that really artificially skewed the scores, though. When we went to the finals, one team member (who normally achieved one run, one pass) kept flubbing his entry. His protracted time in the seat meant the rest of us were artificially restricted in getting our problems done. I had an entire program written out but was never able to even type it in. I suppose you could argue that we should have managed that resource better, but you cannot realistically predict someone is going to have an off day; it was 'clear' that he understood what the problem was and that the next change was going to be successful. Ah well, it was a lot of fun.


I should get round to submitting my updated version of the 'have a bunch of kids be logic gates & simulate a binary adder' exercise I found on the net a few years ago which has gone down really well the last few times I've run it.
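For anyone who wants to sanity-check the wiring before handing gate roles out to kids, the circuit can be sketched in a few lines - each "kid" plays one gate, with XORs producing the sum bit and ANDs/ORs producing the carry. The names here are mine, not from the original exercise:

```python
def full_adder(a, b, carry_in):
    """One full adder: two XOR kids, two AND kids, one OR kid."""
    s = a ^ b ^ carry_in                         # XORs compute the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # ANDs/OR compute the carry
    return s, carry_out

def ripple_add(x_bits, y_bits):
    """Chain full adders into a ripple-carry adder.
    Bits are equal-length lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 (binary 011) + 6 (binary 110), LSB first: gives 9 (binary 1001).
print(ripple_add([1, 1, 0], [0, 1, 1]))  # [1, 0, 0, 1]
```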


I'm teaching an architecture course, and I would really appreciate it if you could share the link.


http://intellectualicebergs.org/kidputer/ was my original source.

I’ve tweaked the exercise a bit though, so I’ll see if I can get those changes written up - the original author gave us permission to use the work as we saw fit back when I worked at the Dept of Comp.Sci. in Oxford.


There is no problem with using a computer for teaching. The problem is the nature of the modern personal computer. It's too usable for things other than programming, which constitute numerous distractions.

Maybe what you need for teaching is a computer with a programming language in the firmware, which gives you that language's REPL (and nothing but that) within a fraction of a second of powering up.


Strong agreement. As a kid I learned more programming skills from my HP-48 (a forth/scheme hybrid REPL posing as a calculator) than from the PCs I mostly used for video games.


As a parent, I'm using my old 8-bit machines to teach my kids computer science. It worked then, and it still works today. In fact, the slow, cranky old 1 MHz machines in the retrobattlestation are becoming more and more the locus of activity, and they're still getting massive use. The value of them is: no Internet. You only get what you put into it. And my eldest, an 8-year-old, is having a blast learning how to make pixel graphics, old school .. finally a use for his math!


I'm seeing a lot of comments here about the format and how it's not great to learn CS without a computer, etc etc.

I personally think this site is AMAZING and I plan to share it with many of my friends.

Perhaps this is because many of my friends, like my wife, are elementary school teachers and I've seen the kind of resources they usually have to work with.

As far as resources for elementary school kids, this site is far and away the best I've ever seen. They clearly took their time to do things well, knew their target audience, made a site that is super useful to teachers.


The materials here are also far more engaging than hopping on a computer.

Funnily enough, during the first year of my Comp Sci degree I didn't have a computer... I used to print off Javadocs, go home, write a program using pen and paper, and go to uni the next day to type it in. Of course, this is what people used to do back in the days of punchcards as well!

Not to say that you should go without a computer, but people in their bubbles seem to think that everyone has a computer, and this just isn't the case - not even in 2015.


Which site are you talking about?


The one linked at the top of the page, I presume.


I'm building a K-12 comp sci curriculum for a network of schools and we're pulling a LOT from this. Why? I previously taught an intro CS course to 8th graders with the "jump straight to the code" method, and found that many, if not most students struggled with conceptualizing and articulating algorithms, even in Scratch which is very visual.

Moving forward, our curriculum is going to build those skills up front and off-screen, then move to code. How we build those is a continuing work in progress, but CS Unplugged is a big part of that.


Maths should be preparing students for this in advance. After all, it's just maths, even with the sorting and all. We are not talking about systems programming or suchlike. Still, as others mention, having a PC with a debugger and other distractions is ... distracting.


"Learning music by reading about it is like making love by mail." - Luciano Pavarotti


"Computer science is no more about computers than astronomy is about telescopes."

(Attributed to) Dijkstra.

"Computer science isn't a science and it isn't about computers"

(www.jonahkagan.me/projects/writing/cs-essay.html, but I think that isn't the original)

You can learn music without ever touching an instrument, but you can't learn to play an instrument that way.

Similarly, learning computer science is not the same thing as learning to program or learning software engineering.


>"Computer science isn't a science and it isn't about computers"

Earliest use of this quote that I know of is from Abelson in his 1986 lecture teaching SICP: https://youtu.be/2Op3QLzMgSY?t=24


Hence, we should call it "Information Technology" instead.


EWD pioneered or refined so many aspects of concurrency, multitasking, distributed computing, and imperative programming that are taken for granted today, while scarcely using computers at all. He wrote a multitasking OS on paper two years before he ever got the chance to implement it on real hardware.

In fact, doing computer science or theory of programming without a computer was remarkably common due to scarce access to hardware, even as recently as the late 80s with people writing BASIC programs on paper before ever getting access to a PC. I'm sure it still happens.

I actually think that teaching CS and programming as hard pragmatic subjects based on doing real-world programming is likely harmful. The impedance mismatch between contemporary, enormously complicated software systems and the theoretical essentials is too great, and may lead to the risk of being unable to think sufficiently abstractly - that is, the risk of associating one implementation of a name with the actual first principles of the name in question. See: people being baffled by Plan 9 because they associate "file" with something more specific than a tree node, or a generic representation of data.


On the other hand, you don't need to hire a cast and crew to be a playwright.


"Sounds like there might be some money in that." - Hugh Hefner


“Talking about music is like dancing about architecture.”


This is the reason European CS is generally much more theoretical than American CS: in the early days of computing, Europe was rich in talent but not so much in hardware, so it focused on computer science in a purer sense; the Americans had plenty of hardware, so they focused more on pragmatic aspects.

This still has some effects today in department cultures even though hardware is readily accessible to all now.


It would be interesting to hear more about that. Do you have a source you could post a link to? Cambridge University in particular felt (in late 90s) like it might have been affected by that.


When I was at uni we took CS Unplugged around primary schools to play some of the games from the materials and teach kids some of the basic concepts. It was heaps of fun and I highly recommend that other people do the same!

In particular I love http://csunplugged.org/error-detection/ as it's a magic trick. I've used this at parties, and I find it funny that it's a computer science concept.
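The trick is row-and-column parity: an extra card per row and per column makes every line contain an even number of "on" cards, so a single flipped card betrays itself at the intersection of the one odd row and the one odd column. A rough sketch of the bookkeeping, with 0/1 standing in for the two card faces (function names are mine):

```python
import random

def add_parity(grid):
    """Append a parity bit to each row, then a parity row, so that
    every row and every column has an even number of 1s."""
    g = [row + [sum(row) % 2] for row in grid]
    g.append([sum(col) % 2 for col in zip(*g)])
    return g

def find_flip(g):
    """After one card is flipped, exactly one row and one column
    have odd parity - their intersection is the flipped card."""
    bad_row = next(r for r, row in enumerate(g) if sum(row) % 2 == 1)
    bad_col = next(c for c, col in enumerate(zip(*g)) if sum(col) % 2 == 1)
    return bad_row, bad_col

grid = [[random.randint(0, 1) for _ in range(5)] for _ in range(5)]
g = add_parity(grid)
g[2][3] ^= 1            # the "spectator" flips one card
print(find_flip(g))     # (2, 3)
```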

Tim Bell's also super nice, and it's fantastic that he developed this!


This is the "new" computer science, the one with no math.


Reminds me of this approach:

http://everyonecanprogram.com/

UK schools are teaching programming with marble runs.


I went through the K&R C book writing out the exercises and notes in a notebook. I had limited access to a computer at the college I was attending. Thankfully I was well beyond my "Intro to UNIX" class and that granted me the free time to play with the code I had written in my notebook. I think learning to code this way can be a huge aid in conceptualizing certain CS / programming topics.


In my intro CS course, the instructor said on the first day that they don't teach programming - there was a zero-credit lab for that. He said this was the design and implementation of algorithms. The labs were three hours, zero credit, taught in C and on Linux. If I remember right, the machines in the lab were running AfterStep; I used Window Maker at the time on my personal machine. Memories...


I had a 3rd year undergrad CS class in "Programming Paradigms" where the final exam involved writing programs in Prolog, Lisp, C, and Fortran ... With paper and pen. It was awesome. As I recall I scored a 100 and was a TA for the course the following year (in my 4th yr)


I dropped out so I could get a job to buy a computer to keep studying :( Very bad financial situation.


This doesn't make sense in a world where Raspberry Pis cost $35


Not much use without a screen, mouse and keyboard. But yes, if someone really wanted it bad enough, they'd make it happen.


Thanks for sharing; I will pass this to fellow non-native programmers, for them and their kids. I'm afraid to say that in a productive environment CS is still much more a technical effort than a cognitive one for "us". But Python helps, aha!


Hoosiers - a basketball movie (the best sports movie ever) - had its first practice performed without a basketball. This reminds me of that.

Love this organization, with the activities and the activities with videos. Great presentation, to me.


The content of the source seems to be good, but somewhat noisy - too much intro, preliminary info, table of contents, etc. It takes digging to get to the actual knowledge.


This is excellent. Great to get kids interested. Heck, my mom would be interested too.


This is so cool! Are there any similar efforts worth bookmarking?


Kind of ironic that it's without a computer yet you need one to access their site :)


Or you can print it out and give it to someone who wants to teach kids some CS!


That's what we have actual print books for, and PDFs that you can print out and share.

Ironic is when I take Computer Science Unplugged activities and then put them in a virtual world environment :-)



