I ask most candidates to write code on a whiteboard in front of me and I’m not apologizing for it!
The problem is usually fairly easy and if the candidate cannot get to a “describe solution in words” milestone then I will guide them to one. After that I ask them to write code. I will also tell them “the code is the part I care about”.
The specific competency I am evaluating here is “can you turn thoughts into code”. I repeat that phrase in interview training so much I think people make fun of me for it.
Imagine I give you a python list with 100 elements and I ask you to write code that will find all instances of the number “5” and move them to the front of the list. I don’t care about performance.
You know you have to write a loop and whenever you see a 5, remove it and put it at the front. Easy. Not even really an “algorithm”. Some people can make code happen very easily and accurately. But some people really need to think about it - what control flow to use, off-by-one errors, bounds issues - and make mistakes. Some people don’t see their own bugs. Some people do weird stuff that makes me think they haven’t seen a lot of code before.
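For concreteness, here is one way that "loop and move the 5s" sketch might look in Python. The function name and approach are my own - the comment above doesn't prescribe an implementation - but it deliberately uses the remove-and-reinsert strategy described, which is exactly where the off-by-one and shifting-index pitfalls live:

```python
def move_fives_to_front(nums):
    """Move every 5 in nums to the front, preserving the order of the rest."""
    result = list(nums)  # work on a copy so the caller's list is untouched
    moved = 0  # how many 5s have already been moved to the front
    for i in range(len(result)):
        if result[i] == 5:
            result.pop(i)            # remove the 5 from its current position
            result.insert(moved, 5)  # place it just after the 5s already moved
            moved += 1
    return result

print(move_fives_to_front([1, 5, 2, 5, 3]))  # [5, 5, 1, 2, 3]
```

The subtle part is why scanning with `range(len(result))` stays correct even though `pop`/`insert` shift elements around: every shifted element has already been scanned and is known not to be a 5. Exactly the kind of reasoning a candidate has to do out loud.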
I teach interviewers to evaluate the act of the writing as much as the end product, kind of like when airport security asks you “where did you stay in New York” and doesn’t really care about the answer so much as how shifty you look when answering it. It doesn’t mean you have to materialize perfect code on the board to pass - not even close! But this exercise provides information, and when other exercises corroborate that information we use it to make a hire/nohire decision.
Anyway, bottom line is whiteboard code is a completely reasonable technique to deploy in an interview setting and if you do this you shouldn’t feel bad about it. Much more depends on (a) whether the interviewer is trained and calibrated and (b) whether the company knows what it is even trying to evaluate than the question format.
Thank you for writing this post. Whiteboard code interviews get a lot of hate on HN and I suspect that folks who have come to rely on them for part of their hiring signal have simply stopped posting.
I've done a lot of lower-level systems work and the fact is that even with a long career and a history of very successful products I get whiteboard coding interviews when I look for a job ... and it's just fine. Yes, it's not the same as day-to-day work, but it's related, and I have to admit that I find them kind of fun.
Maybe there are 2 types of "programming jobs": 1) you will have to implement data structures and algorithms, or 2) everything will be provided for you in a language or framework. For job type 2) CS is not needed - it's not a computer science related position ... it's "frameworking". I'm guessing that a lot of the whiteboard hate comes from type 2) people applying for type 1) jobs, OR type 2) companies asking type 1) questions.
I would like to actually figure out what the best way is to do interviewing for coding positions, but it's super hard to have these kinds of discussions on HN now without people getting upset at even the suggestion of whiteboard coding interviews.
> Thank you for writing this post. Whiteboard code interviews get a lot of hate on HN and I suspect that folks who have come to rely on them for part of their hiring signal have simply stopped posting.
It seems like there's a population of engineers that is perpetually interviewing. I think the people that post significantly about this are people that are struggling with frequent interviews, and correctly or not, lash out at whiteboarding as why they are having trouble.
Maybe there's some large pool of otherwise qualified people that just can't pass interviews and are being passed over because of it, but I really doubt this is significant.
My personal opinion: I don't mind white board interviews. I'd do a take home test, but I would prefer a traditional interview. And being a short term contractor (also comes up frequently in these threads) as an interview process is obviously impractical.
The pay in software engineering, especially in the Bay Area, is very high. Companies here never have enough engineers for one reason or another - the situation keeps current engineers always looking for higher pay regardless of their skill level, and keeps people outside the industry wanting to get into the high-pay field.
As a hiring manager, I agree wholeheartedly with the root comment.
I love whiteboard interviews. Really. I do.
The problem is -- most start-ups are terrible at them. And most people interviewing at FAANG don't understand the reason those companies ask these types of questions.
Top tech companies ask hard interview questions for a few reasons.
They're not necessarily trying to hire the best people. They're trying to ensure the people that they do hire aren't incompetent. Hardly anyone incompetent is going to pass Google's interview process. Period.
Why are they so focused on not hiring incompetent people? It's not easy to fire people for being incompetent (or even worse -- toxic).
So if they're just trying to not hire incompetent people, why are the algorithms questions MUCH harder than the problems you face on an average day?
Two reasons. One, with so many people interviewing, why would they not set the bar arbitrarily high? Two, the number of people willing to cram and do almost anything for the job means they HAVE to set the bar high.
Now let's talk about the beef with start-up whiteboard interviews. Most of these interviewers have never been formally trained. They're just winging it. They ask questions that don't lend themselves well to a whiteboard. And they don't know what to look for during the interview!
A good whiteboard question does not rely on a trick or an obscure data structure. It doesn't have any gotchas that if you "get it" make it much easier.
A good question is slightly difficult to solve, but has many possible ways to solve it -- each with its own trade-offs. Ideally, none of them are much easier or harder than the others.
It's MUCH more impressive if a candidate can come up with 2-4 ways to solve a hard LeetCode easy problem or an easy LeetCode medium problem -- than for a candidate to happen to know the right combination of obscure knowledge to solve 1 LeetCode hard problem.
If you have 2-3 whiteboard interviews, and a candidate can come up with 2-4 totally different ways to solve the problem, and can materialize those thoughts into code -- there's a very slim chance s/he's bad.
The ideal candidate can discuss the tradeoffs of different solutions, and ask clarifying questions about the usage to figure out an optimal solution.
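As a toy illustration of "several genuinely different solutions, each with its own trade-offs" (my example, reusing the move-the-5s-to-the-front problem from upthread), here are three approaches a candidate might compare:

```python
def move_fives_concat(nums):
    # Two-pass partition: O(n) time, O(n) extra space,
    # preserves the relative order of the non-5 elements.
    return [x for x in nums if x == 5] + [x for x in nums if x != 5]

def move_fives_sort(nums):
    # Stable sort on a boolean key (False sorts before True, so 5s come first):
    # concise, order-preserving, but O(n log n) instead of O(n).
    return sorted(nums, key=lambda x: x != 5)

def move_fives_inplace(nums):
    # In-place swaps, like the partition step of quicksort:
    # O(n) time, O(1) extra space, but does NOT preserve
    # the relative order of the non-5 elements.
    write = 0
    for read in range(len(nums)):
        if nums[read] == 5:
            nums[write], nums[read] = nums[read], nums[write]
            write += 1
    return nums
```

None of these is "the" answer; the interesting part is whether the candidate can articulate when each trade-off (extra space, asymptotic cost, stability) matters.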
When you get to performance tweaks for edge cases, can the candidate weigh the pros and cons of adding complexity for a certain gain?
Is it worth it to add some complexity to a class to take an algorithm from 2n to n? Maybe. It depends on a lot of things...
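A minimal sketch of the kind of 2n-versus-n trade-off in question (my example, not the commenter's): two trivially readable passes over the data, versus one fused pass with a little more manual bookkeeping:

```python
def min_max_two_pass(nums):
    # Two passes over the data: roughly 2n comparisons, but obviously correct.
    return min(nums), max(nums)

def min_max_one_pass(nums):
    # One fused pass: roughly n iterations, at the cost of manual bookkeeping.
    it = iter(nums)
    lo = hi = next(it)  # raises on an empty input, as min()/max() also would
    for x in it:
        if x < lo:
            lo = x
        elif x > hi:
            hi = x
    return lo, hi
```

Both are O(n); the constant factor halves, the line count triples. Whether that trade is worth it is exactly the "it depends" conversation worth having with a candidate.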
I once worked with a guy who would ask a question that could be solved in O(N) time and O(N) space somewhat easily. But the question could also be solved in O(N) time and O(1) space IFF you knew some discrete math. Almost everyone who didn't get the O(1) space solution he would say was a no-hire.
Probably less than 10% of the engineers I've worked with have taken discrete math...
Another guy would ask a question that MOST candidates couldn't even comprehend, let alone come up with a solution for. Most of the people that solved it would solve it in O(n!) or O(n^3). Occasionally, people would solve it in O(n^2). But there is an O(n) solution. And that's what he was looking for. Literally NO ONE ever got it.
If you're testing for their coding process, why do you make them write it out on a whiteboard? Why not provide them a code editor or at least text editor?
I like whiteboard interviews, but they should be testing high level design ("describe solution in words / detail in pseudocode"), not coding skills.
How is providing a code editor or text editor different than writing it out on the whiteboard?
I do a similar process and don't care if their syntax is exactly correct. I, maybe different from the person you're responding to, don't care if they use exact libraries in the precisely correct way. But I do care that they're able to come up with a plausible solution.
At a point, I see people gripe about whiteboard interviews and "specific answers" to questions enough that I'm all in on whiteboard interviews. If someone has an attitude problem over it, I don't want to work with them. God only knows how unwilling they'll be to do something simple like look into the source code of some library they're using. "The job description didn't say I had to do this!"
I realize I am being overly emotional here, but the HN (and broader) attitude problem around something as simple as writing a for loop on a whiteboard irks me so much at this point. It feels like babying and coddling people - apparently older people, who I thought knew how to adapt to whatever life sends their way, since they weren't the everyone-gets-a-trophy generation.
The process of scribbling characters with a marker on a whiteboard is quite tedious and takes a lot of time and effort. It feels more akin to multitasking than a normal coding process, since so much of the effort goes into the scribbling itself (as opposed to the actual coding). The ability to draw glyphs on a whiteboard becomes a confounding factor here.
Being able to type out things in a text editor simply makes the interview process much more efficient.
Note: I share your annoyance with people who gripe about whiteboard problems and algorithms / data structures problems not being representative of everyday work.
> How is providing a code editor or text editor different than writing it out on the whiteboard?
At least for me, it's vastly better. In an editor, I'm in my element: I have autocomplete, things work the way they usually do, and everything is much more comfortable.
I could even throw together some quick tests and test drive my code!
That's if you're used to using your laptop and not your desktop, you didn't forget to charge it, the WiFi is working, you aren't missing your mouse, keyboard, monitor, etc. I've had many candidates fuck all that up and spend 15 minutes fixing the ergonomics. Guess what - it's a 25-minute coding exercise. Good luck. If I give them a laptop, same deal.
After all that, I still give them the option: their computer, ours, or the whiteboard.
Statistically, I've found that the in-IDE candidates don't get as far, because they get hung up on syntax or project setup.
Interviewer and candidate tend to let things like missing curlys slide on a whiteboard or in a plain text editor. I don't even care if you get the size function's name correct on a collection (I can't remember it between the 7 languages I use) or similar.
This is data pulled from 500 interviews.
I don't think places would stop you if you wanted to use your laptop but I do think you're actually optimizing for the wrong thing.
I'll admit this mostly comes from frustration standing by a whiteboard. If I did the same thing on a computer, I'd maybe just have a different set of frustrations.
As someone who does a lot of TDD, it is quite frustrating that a whiteboard can't run tests though...
What I have seen work well, from both sides, is interviewing by pair programming. Not for everyone, for sure.
Just write the test cases as bullet points and actually run through the algorithm with them when done coding. Don't declare success the instant you write that last curly.
Doing that gets a big plus from me (I'm 100% tdd personally).
Remember, I know the answer to the question. The instant I can tell that you do too, and that your reasoning is good, I'm going to stop wasting time getting from 95% sure to 100% sure on that part of the question and go get another piece of data.
Usually for me that means adding some system design, performance considerations, or another area of investigation, trying to cover another overall competency. And I just keep doing that till we run out of time.
Very often other interviewers aren't able to hit everything, so backup signals are great.
And interviews are really risk mitigation. The more 80% certainties I can get, across more areas of coverage, the better - assuming the interview team is doing a good job.
As an interviewer I'd prefer a whiteboard and an open discussion. With a computer I have to deal with pedantic reality instead of discussing the concepts or approach. You also need a projector or a large screen if it's more than one other person in the room.
At one interview I was given a coding test (this was alone and timed). I was put into a separate room and pointed to a computer. It was a Windows box and Notepad.exe was the only thing I saw. I may have found Notepad++, but I'm pretty sure it didn't have a Python interpreter. I hadn't used a Windows box in 10 years and my preferred environment is Vim. After the test they pulled me into a different room with the interviewers and discussed the code. This test was terrible for multiple reasons. I was asked to interview by a senior person who was working there, who admitted that part of the process sucked and was pushing to improve it.
Even at large places that invest heavily in hiring, I don't see them developing/maintaining a test candidate environment, they're not likely going to hand you someone else's account in a production environment, and if they have a generic PC with a guest account, things are going to be missing or out of date. It'd be preferable for me to bring in my own laptop (but that discriminates against people with desktops or who don't have home computers).
I agree with the 'easier discussion' aspect of a whiteboard, but I feel that any whiteboard test should therefore be asking for pseudocode and explicitly disregarding the pedantic reality of syntax.
Yes. I do think there's a benefit in having a reference language--just don't be pedantic. For example, certain design patterns might be needed for C++ that are built into a language like Python. Knowing which libraries the candidate (or the company) leans on is a good topic to discuss. Pure pseudocode might be too generic to suss this out. It depends what you're evaluating them for. There's little to no value in testing them on things like mismatching parens or typos in variable names (I've heard stories about people getting called out on those things using a whiteboard).
> How is providing a code editor or text editor different than writing it out on the whiteboard?
Muscle memory. If you do all your coding in an IDE on a keyboard, it's quite common not to remember how to write a basic loop, because at that point it is so ingrained in your muscle memory that you just sort of hit the right keys.
If I gave you a foot operated keyboard and then asked you to take a typing test, would that be fair? It's still typing, it's just a different way of typing.
I give whiteboard interviews. Believe it or not, I am intelligent. I am not going to disregard you because you got some minor syntax wrong on the whiteboard. I can tell the difference between a fundamental misunderstanding and a mistake that happens due to using a language that wasn't designed to be written on whiteboards. If you, the candidate, don't understand that, then I guess you are not fit for the role anyway.
I don’t really understand the whiteboard hate either. Take home tests don’t reveal how a person communicates or collaborates. Maybe it’ll predict competency for siloed heads down work, but even for those positions you need to be able to explain solutions, get feedback on the approach, and be able to incrementally adapt to new requirements quickly. Even when programming solo I’d wager most people are using pen and paper to sketch out solutions so it’s not like it’s an implicitly unreasonable request to then do this on a whiteboard in a group setting.
And to address the OP more directly, are loops, maps, trees and graphs really “CS trivia”? They seem more like fundamentals to me, and those are usually the subjects whiteboard tests revolve around.
As a manager, finding folks who can turn thought into code is certainly required. However, a whiteboard does not and cannot measure that. The only effective way to measure the ability to turn thought into code is for them to do it for real. Writing on a whiteboard is simply not coding, nor is it an adequate proxy.
Hiring people is also about portraying your company in a positive light. Your allusions to the TSA security screen, and how you have to be "trained and calibrated" seem to imply a robotic kind of detachment that would probably be very off-putting to me.
Often employers seem to be of the mind that they have the treasure and it's up to them to filter out the best candidates (sellers market I suppose) regardless of how they appear in the process. I would imagine that a lot of brilliant people get turned off by this. Which (oddly enough) is also fine by me - like dating, I'd rather spend my time finding an open minded/humane place to work, and I find hiring practices are often a rough initial proxy for that.