How to Ace the Google Interview: Ultimate Guide (byte-by-byte.com)
190 points by samgh 11 hours ago | 138 comments





Strange, I have yet to meet a single Google interviewer who was looking for perfect syntax during the interview. Last year I forgot the syntax for a data structure, told my interviewer "something like this," and he just said "that's fine." Got the internship later on. I even had one interviewer who was ok with me writing out matrix algebra mathematically instead of using np.matmul and all that.

As an interviewer, I can confirm that I don't care about syntax or whether the program compiles if I'm convinced their solution and approach would work. I'm also OK with candidates using placeholder helper functions or shorthand for trivial things (e.g. null/undefined check in JS) if they explain to me verbally what that part is supposed to do.

I also interview software engineering candidates at Google (n=150) and while I mostly agree, I do think there's some signal in whether a candidate can get the syntax right. It's not a dealbreaker if they don't, but all things considered someone who comfortably writes code all day is more likely to be able to write syntactically correct code than someone who doesn't.

The main things I want to see, though, are: can you communicate well about the parts of the problem that aren't clear to you? Can you analyze and compare solutions? Can you figure out something reasonably efficient? Do you understand your solution well enough to code it?

(Speaking for myself, not my employer.)


> I do think there's some signal in whether a candidate can get the syntax right

I agree with that. I've had candidates explain certain points of syntax as they worked, or demonstrate their knowledge in other ways, or write their code particularly well or cleanly, and I make sure to mention that positively in my notes. But less-than-flawless syntax by itself isn't a negative for me.

I agree with the main things you look for - I too try to focus on those more than syntax.


I can confirm. In one of my interviews, when writing Python, I used arr.push instead of arr.append (I don't usually write JS, but idk what happened), and I realised the mistake only half-way through. The interviewer noticed it apparently, and he said he didn't care. I imagine he didn't want to throw me off my thought train, which was nice. Got the offer too, so it really must not have mattered.

A friend said recently, "people want to be employed without becoming employable". These guides really exemplify this obsession. Sure, Google has a nice salary and good perks and whatever. But after you get the job, you have to do the job. I wonder whether the people who read these guides and try to study just the right topics to get a job actually like programming.

These guides act as optimizations, shortening the path you need to take to get the job, shortening the stuff you need to learn, etc. But in the end, the path is all you get. If you don't like programming and if you don't like learning, then are you really gonna like Google?

I suppose there are people who genuinely like programming and just need a manual to teach them how to play the game. Lord knows I've practiced my fair share of whiteboard problems when I'd rather be reading about compilers. But there's something wrong about having to play a game to get the job.


This is getting ridiculous. These guides to interviewing at specific companies are starting to sound like the video game cheat code books of old.

If the process is so nuanced that there's an entire industry around these types of guides (and Google even highly recommends you buy them!), then the process is fundamentally flawed.

But we already knew that, and as long as others are still playing the game, we are forced to play or miss out.


I completely agree.

Even Google suggests that you "practice writing syntactically correct code on a whiteboard". This is clearly a useless skill for a software engineer except for getting a job at companies that do whiteboard interviews. Have you ever tried to refactor code on a whiteboard?

How are they able to find people that are able to efficiently debug problems?

When I interview people I tell them, "Bring your own laptop set up to be able to code and debug". And I give them "Fix this site" or "build this thing" kind of problems.

It seems to work a lot better for finding "hidden gems" and people who are good at "doing" instead of those who are just good at "telling".


> This is clearly a useless skill for a software engineer except for getting a job at companies that do whiteboard interviews.

Have you ever actually been on the interviewer end of the process?

Over half of the candidates literally don't know how to program! They can sort of string together a Markov-chain something if you sit them down in front of an IDE and let them copy-paste stuff until the syntax errors go away, but put them at a whiteboard and they don't know where the parentheses go in a function call.

(They'll write crap like "()f" or "f()a" when they want to call a function, stuff like that.)


Elite schools do that in undergrad; e.g., as an initial scored lab exercise: write this recursive fractal shape using Logo, parallel Delaunay triangulation with prefix sums, a dynamic-programming solution to an oligopoly problem, or single-value Paxos pseudocode on a piece of paper in 10 minutes (I am being serious). If you can cope with it, it immediately shows up in the interview and you are considered a member of the club, a person worthy of having conversations with.

> If you can cope with it, it immediately shows up in the interview and you are considered a member of the club, a person worthy of having conversations with.

This is the best part of these interviews. The utterly dehumanizing nature of it. You aren't even treated as a person until you've uttered the correct series of phrases that signal you are a member of the right class. Every word that comes out of your mouth, and everything on your resume up until that point is completely disregarded and treated as a lie.


Which elite schools? I went to MIT and didn’t have to do any of this.


Just to add context: the MIT Independent Activities Period (IAP) is a few-week "winter break" during which anyone can lead classes/sessions on any topic: http://web.mit.edu/iap/about/index.html

MIT CSAIL is the research lab formed by the merger of the legendary MIT AI Lab and the MIT Lab for Computer Science. People who work hard and win the lottery to do research at CSAIL (or another prestigious lab) shouldn't afterwards be looking for entry-level coding jobs for which a whiteboard code-monkey dance interview/hazing would be appropriate, IMHO.

(For different reasons, people who are in programmer career tracks, with verifiable industry track records and/or open source involvement, also shouldn't be put through the entry-level hazing. Claims that the ritual gives certain companies metrics or somesuch would carry more credibility, had those companies not been caught brazenly colluding, at the CEO level, to systematically suppress wages and mobility of their own employees, with presumed spreading market effects throughout industry.)


The course you linked to doesn’t cover any of the algorithms mentioned in the grandparent. It does however cover about three dozen common interview questions. The course is also from 2009.

That's a student taught class. Not really policy-defining.

It's a Caltech and Stanford thing. MIT isn't really elite.

Oh not in Utica, it's more of an Albany expression.

I see.

Sure.

Course numbers or it didn't happen

I’m having a hard time believing this is true....

If nothing else, Logo? In 2019?


I think the term "Logo" sits at a perfect abstraction level; in reality it could have meant "using the turtle.py you programmed last week with turn, move, etc. commands", don't you think? Do we need to be super-precise down to minutiae when discussing informal stuff? Moreover, it's on paper, so who cares what syntax the pseudocode is in; it could even be Logo itself.

Hacker News has its own Instagram reality to it.

I’ve heard that Google allows you to type code into a computer now during an interview. A welcome improvement in the state of the art.

Still no access to a compiler or debugger, but baby steps.


I had to write code exclusively on a whiteboard last month. The only laptop in the room was the interviewer's, which he was using exclusively to furiously copy the code I was writing on the whiteboard.

He said he would compile it and submit a report when he got back to his desk. :/


> He said he would compile it and submit a report when he got back to his desk. :/

So, he's asking you to do something he can't do? Why can't he just read your code and know how it will behave?


Sounds like a great opportunity to whiteboard an exploit instead.

I'm a moron and did something stupid not too long ago that would be perfect for this.

I'm too lazy to find my actual code but here's the gist:

C#, but something similar might work in Java/etc. too:

    // create an Image of some size (Bitmap derives from Image)
    Image img = new Bitmap(10000, 10000);

    for (int y = 0; y < img.Height; y++)        // loop over the height of the image
    {
        for (int x = 0; x < img.Width; x++)     // loop over the width of the image
        {
            // "cast" Image to Bitmap since Image doesn't have GetPixel --
            // the Bitmap(Image) constructor actually copies the whole image each time
            Bitmap b = new Bitmap(img);
            Color c = b.GetPixel(x, y);         // get pixel value or whatever looks legit
        }
    }

This creates a new bitmap for EVERY pixel in the image. If the image is large enough and the system is 64-bit, bad things happen. On Lubuntu this code burned through 8 GB of RAM in seconds and was well on its way to eating all the virtual memory before I forcefully shut it off.

Maybe this is a Linux issue but it hard locked my system. I couldn't switch to a different terminal or anything.


I've done nearly 100 interviews at Google, at least half of them with me copying code from a whiteboard, and I have never tried to compile a line of code that a candidate wrote.

I also go out of my way to make it clear that I don't care about every hanging parenthesis, indentation, or typo.

I care about whether or not the candidate asks for clarification or bulls ahead with assumptions, whether the overall algorithm works, whether the candidate can identify the limitations of, and bugs in, their solution (everyone has bugs; everyone; there's nothing wrong with that), and what their testing strategy is.


I wish more interviewers were like this

I'm totally sympathetic to how stupid these interviews feel.

Syntax is something that novices screw up almost without fail. I saw this in my own subordinates, who simply would not notice syntax errors in their own code on their own laptops, even with the red squiggly underline. Also, I'm sure Google has data on its interviewing process and found in some important, measurable way that there's a problem with hires who screwed up the syntax. Correspondingly, none of my decent subordinates ever screwed up syntax once. I think this is the least controversial part of their process because it is dumbfoundingly easy and low cost.

Engineering interviews are overall a mature process, and I honestly doubt there's that much innovation in terms of raw skills discovery. TripleByte, for example, doubled its multiple-choice question count from 18 to 36, introduced design questions, and now has a brief 45-minute free programming problem. Hardly a huge innovation.

Clearly what's immature is the feedback and communication to candidates who fail. At Google specifically, communication problems don't just crop up in threads full of rejects. Bad communication affects people working there, like product managers, admins, and designers, who lack the skills the tests screen for, even though the tests themselves don't obviously correspond to anything the engineer actually does day-to-day.

It's bad for morale to test for stuff that doesn't matter. That may explain why smart people report the engineering org is not welcoming to people of different backgrounds. The empathy (or savvy) needed to throw out stupid tests is the same kind you need to understand people who don't share the same culture or values as you.

Google doesn't really know that because the company only knows the things it measures. It takes insight too to know what to measure and how things are related, especially when dealing with human beings and not software.


I agree with your comment in general, but, I worry about the bias of "Bring your own laptop set up to be able to code and debug" - some perfectly qualified candidates don't have laptops. Some perfectly qualified candidates do have a laptop but don't code much at home, and the setup they're used to is their work machine or a school lab computer. We already have too much bias in favor of code-all-day-code-all-night candidates (e.g., looking for GitHub profiles) and I think requiring someone to bring their workspace might push this to the point where good candidates who happen not to code outside business hours wouldn't stand a chance. (There are lots of reasons for this that wouldn't impact how qualified you are, from "I have a family" to "My workplace is real stingy about open source, so I haven't bothered to set anything up for non-work coding".)

I'd be a lot more comfortable with "We have a machine set up for you, but you can also bring your laptop," as long as the machine is actually well set up and you don't fall into the implicit expectation that passing candidates will bring their laptop anyway. (Most of them will, in the end.)


> some perfectly qualified candidates don't have laptops

So get one. You can get a good one from the pawn shop for $200. You can afford that if you're interviewing for a 6 figure job.

Edit: I'm not kidding. I've bought $200 laptops from the thrift store, usually for travel purposes so I don't worry about losing/breaking it.


When I was looking for work, my laptop setup was embarrassing: a MacBook Pro with both a broken keyboard and a misbehaving trackpad. I had to bring an external mechanical keyboard and trackpad with me. It worked out in the end, but that could leave a bad impression with interviewers.

It wouldn't bother me any. I'd just think you were thrifty, and were capable of working around problems to get things done.

Or it means the process is well defined enough that you can create a level field by telling everyone how it works through guides like this instead of giving connected people a leg up due to inside knowledge of a poorly defined process.

Yup. It's in the same spirit as the SATs - people also practice for those.

I can guarantee you that more than 90% of people working in highly compensated positions at the world's most famous corporations have had a lot of "leg up" benefits in their lives.

Leg ups that have helped them become better at software engineering? Sure. Leg ups in the form of insider connections / elites? Less so, particularly for talent they're hiring today.

This is a significantly more meritocratic industry than most, and within the industry, the FANGs are relatively more meritocratic than smaller startups.


Meritocratic in the most boring, uninteresting sense of the term, where you literally only work on things in a way which Google tells you will help advance you up their ladders, while not actually working on anything in a way you would find interesting in general.

Google used to run “how to interview with Google” multi-day bootcamps where they’d train you to pass its interviews. I’ve gotten a couple of those candidates and thought it was just ridiculous.

On the contrary; the fact that Google recommends them indicates the process is not flawed.

That is to say, their interviewing process explores things at sufficient depth and breadth to make knowing "the" solution not a deciding factor.


Maybe the sort of person who has the time/desire/interest to study a guide to figure out how to fit in at google is exactly the sort of person they want to hire.

Yeah, I don’t think my whiteboarding skills are particularly useful at work, but the personality type that makes me obsessively practice whiteboarding for weeks before a job interview tour also makes me effective at work.

The bikeshedding engineer is exactly what Google or Amazon is looking for. Nothing wrong with that either, I just take issue with the entire industry standardizing around that type of engineer.

My impression was the opposite.

You're given a problem, there are usually a few clear-cut ways to solve it. You're expected to find an optimal or near optimal solution fairly quickly, and explain your approach clearly. That's it.

I've thought the more open-ended, unstructured interviews lead to more bikeshedding opportunities.


Sounds pretty accurate

Well, the (in-house) recruiters recommend them. The engineering staff that does the interviewing generally doesn't get involved in these recommendations.

The recruiter gets paid if/when they push someone through to get hired, which is a very different incentive from what the hiring engineers have.

The real test is how badly you want the job.

Learn with the intent to become a better engineer. Even if you don't hit the Google mark (which is arbitrary anyway), you will become a better engineer. That is the philosophy most people should use when trying to aim for these companies. Don't cram; learn.

I disagree. Doing competitive-style programming and learning all sorts of weird algorithms has not made me a better engineer. I can count on one hand the number of times in my career that I have had to design or use an "interesting" algorithm - and no, not in the "not knowing what you don't know" sense where I could have used one if only I'd known about it.

I'm very critical of this approach to interviewing in general, but it also isn't the same as competitive programming and it isn't focused on "interesting" algorithms. It is far more focused on understanding how to use data structures and the trade offs between them. Chapter 3 of "The Algorithm Design Manual" (which is fittingly titled "Data Structures") is really the most useful reference for the majority of these interviews. I don't think this is the most useful thing for software developers to be good at, but it's definitely useful and worth learning. I was annoyed that I had to study for one of these interviews, but ended up being pleased that it forced me to review this material.
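
For what it's worth, here's a tiny Python sketch (not from the book, just an assumed illustration) of the kind of trade-off that chapter is getting at: membership tests on a list scan every element, while a set hashes once.

    import timeit

    data_list = list(range(100_000))
    data_set = set(data_list)

    # worst case for the list: the value we look up sits at the very end
    print(timeit.timeit(lambda: 99_999 in data_list, number=100))  # O(n) per lookup
    print(timeit.timeit(lambda: 99_999 in data_set, number=100))   # O(1) on average per lookup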

Most of these algorithm questions focus on:

1) binary trees, for which there's usually an STL container anyway

2) some sort of odd string/array manipulation that never comes up in real life

3) some sort of linked-list manipulation where, again, there is an STL for that and/or it wouldn't come up in real life

4) some optimized algorithm that literally took the first person years to discover (Prim's vs. Kruskal's MST, for example), but that you're now expected to "figure out" on the fly

5) solving all these tasks under insane time pressure that, again, is not a simulation of real life; it just makes it stressful, almost as a rite of passage


I think this might be a better mindset to approach it with, instead of pure willpower, correct or not.

I had a CoderPad Facebook phone screen last week; the interviewer said, "that code won't compile, can you see why?" ... It was missing a semicolon.

On a related note, I write Scala on a daily basis, but Scala is not that well suited for coding interviews, I found.


Facebook likes their interview nonsense, but stores passwords in plaintext. I think the credibility of the system is in question

They didn't store passwords in plain text. Passwords were accidentally captured in logs, and those logs were stored in an internal DB. This happens often.

That's because Scala doesn't have nearly enough semicolons.

Kidding aside, why do you feel Scala is unsuited to coding interviews, assuming the interviewer well-understands the language?


It's hard to write heap sort in a functional style with immutable collections. I guess you could write imperative-style code.

Well, ah... unless you're applying at Amalgamated SortCo Industries, asking for an on-the-spot sorting implementation is pretty high on the list of absurd interview questions.

Anyway I'd think one might use an array and not some other list/collection.


I used clojure once during an interview and bombed, passed again 2 years later at the same company with Python. Would’ve made a big difference in my equity had I just started then!

The amount of time required to 'learn' the Google interview would be time that could be spent learning more universally applicable skills.

Is it true that an experienced developer would not be able to pass the interview without studying using a similar guide? If so, then the interview process is... fubar.


A perfect job interview could be defined as an interview for which the best study technique is to become a better choice for the role. Studying skills that would not directly contribute to job performance would not change the result of a perfect job interview in any way.

In that sense, it's probably a good indicator for Google that the interview advice includes "practice writing code", "make it a habit to validate input", and "learn about data structures", and it's probably a bad indicator for Google that the advice includes "practice writing syntactically correct code on a whiteboard" and "practice solving problems with a 30 minute timer."


I'm in the interviewer pool @ Google.

> practice writing syntactically correct code on a whiteboard

This probably differs from interviewer to interviewer as to how strictly it's adhered to, but it's not really a hard and fast rule. I'm sure there are some interviewers that will ding you on a forgotten semicolon, but I suspect that most would not.

Personally I look for code that isn't so far from syntactically correct that it's clear you are trying to BS me. I'll even accept pseudocode for the most part. But I've had candidates that try to make up language features, and that doesn't fly with me.

> practice solving problems with a 30 minute timer

I only give my candidates 30 minutes. The whole interview is 45: I spend 5 minutes introducing myself and setting up expectations, 30 on the question, and 10 on answering their questions (after all, they're also interviewing us).

You can tell pretty early whether they're on a solid trajectory, and I'll offer the occasional hint to keep someone on track, or ask tangential questions if they're doing well on time. Not finishing isn't a deal killer, provided you had a solid approach and weren't just running in circles. But a good candidate will finish in about 25 minutes and we can spend some time talking about alternate approaches. Sometimes I'll show them the optimal approach and see how that conversation goes.

Nine times out of ten a candidate scores low because they overlooked an infinite loop or their code would crash on boundary conditions, and the candidate wasn't able to realize that even with hints.


> Nine times out of ten a candidate scores low because they overlooked an infinite loop or their code would crash on boundary conditions, and the candidate wasn't able to realize that even with hints.

I guess this depends on how you run the interview but one of the things that frustrates me about whiteboard interviews that focus on the coding rather than the design is that I have to step through test cases (especially edge cases) like this manually, which is tedious and not at all how you do things in real life, where you just run the test cases and see what happens instead of having to run your own code on a whiteboard.


My question is about walking around a data structure. Most candidates choose to represent it with arrays.

I agree with stepping through code being lame. If a candidate has tried to do some sort of boundary condition check and mentions a test that'd catch it, then I'll give them a pass. If a candidate just blows past the code without any attempt whatsoever to check that they're in bounds on an array, then I'll ding them for that. You have no idea how often I see code like...

    Node neighbor = data[x+1][y];
... without any check at all to see if x+1 is in bounds for data.
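
In Python terms, a minimal sketch of the kind of guard I'm hoping to see (the grid and coordinate names here are made up for illustration, not the actual question):

    def right_neighbor(grid, x, y):
        # check bounds before indexing instead of assuming x+1 is always valid
        if 0 <= x + 1 < len(grid) and 0 <= y < len(grid[x + 1]):
            return grid[x + 1][y]
        return None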

The infinite loop in my question is fairly obvious (because it involves walking around a data structure iteratively). Candidates either see it right away and handle it as they solve, notice it halfway through and crowbar it awkwardly in, or don't notice it until asked to walk through a specific test case, or don't even notice it while walking through the test case.

Ultimately, if the code looks correct to me, or even close (I mess up occasionally) then I'll just ask what cases they would test.


> But I've had candidates that try to make up language features, and that doesn't fly with me.

This piqued my curiosity. Can you give an example?


I'm still floored by how many candidates I've talked to that assert with great confidence that the local variables they declare will still be there with the same values when they call the function recursively.

And most recently, when iterating over a string's characters the underlying string methods KNOW that the string is being iterated and will pick up at the current iteration point. For example, you've got the string "5432112345". And via iteration, you're currently pointing at the first "2" at index 3. The candidate asserted that if you call "indexOf('2')", it would return 6 because "indexOf" knows that it is iterating and should start 1 beyond where it is pointing at.
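
To make the distinction concrete, a small sketch in Python (assuming Python string semantics, not whatever language the candidate used): search methods know nothing about your loop, so you have to pass the start position explicitly.

    s = "5432112345"
    i = 3                        # our own iteration position, pointing at the first '2'

    print(s.index('2'))          # 3  -- always searches from the beginning
    print(s.index('2', i + 1))   # 6  -- finds the later '2' only because we passed a start index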

And my favorite - once had a candidate that misspelled a function name while coding in Ruby. I didn't ding them for it. But I pointed it out because it just bugged me. And the candidate then swore to me that Ruby will automatically call the correct method if there was only 1 candidate based on the misspelling. You know when you do things like "git inti" and it says "did you mean init?". The candidate swore that Ruby would just call "init" for you because it knew what you wanted.


https://github.com/yuki24/did_you_mean#installation :

    Ruby 2.3 and later ships with this gem and it will automatically be required when a Ruby process starts up. No special setup is required.
It doesn't call the method for you, but it does do the did-you-mean automatically if you misspell and it's close enough.

This one candidate had been allowed to skip the phone screen, and phone screens would likely have filtered them out. Candidate used Java and claimed to be deeply familiar with it. Obviously this isn't exact, but the result looked something like this:

    public isTheFooBarred(class Node { int[][] positions, int size } node, int x, int y, Set visited = new HashSet(node.size)) {
      // ...
    }
The first parameter to the method had an inlined class definition, and the last parameter was optional. We discussed this and the candidate claimed that this was legal vanilla Java. In the case of the last parameter, the candidate claimed that the compiler generated every possible combination of methods including and excluding each of the optional parameters (so, 2^n methods).

I really tried to give the candidate the benefit of the doubt, and asked if there was some sort of annotation processor or code pre-processor that they were using, but they were adamant that it was plain old vanilla Java. I'm pretty sure that they were using some in-house built thing, but their solution had significant problems, so it wasn't the only reason I was against the hire.


I once passed a whiteboard interview just calling random made-up operations on generic java arrays, like Array.flatten().

I was relatively new to programming and had been practicing in Java but didn't know how to execute a lot of map/filter/reduce operations off the top of my head like that. I did, however, know that my interviewer was a Python programmer that probably didn't know much about Java language features.


I thought you were allowed to assume functions you needed in the interest of a modular solution. Array.flatten is an obvious “assume I have this, I would write it anyways.”

I think most decent interviews will give you a pass on an enhanced standard library.


When doing an interview at Google in C, I asked if I could assume I had a hashtable implementation with so-and-so interface, and the interviewer said no ¯\_(ツ)_/¯

Ouch. Well, most hashtables in C are custom, I guess.

If you can explain what it is supposed to do, I don't care a whit if you make up a method that probably exists in a standard library somewhere (Guava / Apache Commons, or wherever). If it's slightly more obscure, you might have to write pseudocode to show you know how it might be implemented.
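
For something like flatten, even a one-liner sketch (Python here, purely hypothetical) is usually enough to show what the made-up helper would do:

    def flatten(nested):
        # [[1, 2], [3], [4, 5]] -> [1, 2, 3, 4, 5]
        return [item for sub in nested for item in sub]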

I've had people interview for a Java job who didn't realise strings were immutable.

The question asked was a very short multiple choice; only two of the answers were possible under immutability.


That doesn't sound like a good question at all. It sounds like you're looking for "Java programmers" instead of solid engineers.

When I interviewed for my current job using Java, I had been programming in Ruby and JS for the better part of a decade and had to refresh my Java syntax fairly quickly. I know I made some dumb syntax mistakes in my phone interview, like instantiating collections totally incorrectly. I distinctly remember one of my in-person interviewers saying "well, in Java it's boolean, not bool, but sure...". More semantically, I may very well have messed up mutability of collections and primitive conversions and boxing and such.

Enough of my interviewers saw through all of this that I got the job. Now people on my team come to me and say, "hey you're a java guy right?" and ask me questions about this stuff. I wasn't a java expert when I interviewed, but now I am, because that's what my job required, so I learned it. That's what my interviewers were looking for, to the company's benefit.


> Nine times out of ten a candidate scores low because they overlooked an infinite loop or their code would crash on boundary conditions

Why does this even matter? They're writing the code on a whiteboard, without being able to compile or debug, in 20 minutes. Does Google really expect code to be correct and production ready in 20 minutes?


> ... and the candidate wasn't able to realize that even with hints.

He said why. Why did you misquote?


That is why they were "dinged".

Not why those things matter in whiteboard coding.


The standard solution to my problem is on the order of 12 lines of code. I don't expect it to be perfect or "production ready", but the infinite loop should stick out like a sore thumb. I do expect candidates to demonstrate that they recognized that problem, and to at least make some attempt to check that they didn't overrun the bounds of an array in the 30 minutes they have.

Hard to even get the interview. I've applied a few times as a senior dev and was rejected based solely on my CV, with the feedback of "work on your algorithms"...

Based on their perception of my algorithmic experience from my CV? Google's a weird one.


These guides are just people trying to make a quick buck. There's no shortcut to getting good and perhaps that's why these type of interviews are here to stay.

Amen

It's also a good marketing cash cow for creating paid courses and trainings for Google interviews, the dream of so many novice devs and other "Intensive Coding Bootcamp" participants.

Keyword research and trends... https://trends.google.com/trends/explore?date=all&geo=US&q=G...


I hate this style of interview. I give them to prospective engineers every week for one of these FAANGM companies.

They don't test for good engineers -- they test for people who practice this style of interview, and for good new graduates.

It makes sense to ask these questions to new grads, but afterwards there is so much more experience that I feel like is much more important than acing data structures questions.

I am amazing at whiteboard questions, but that doesn't make me a good engineer. It's because I found the trick to solving these, and have practiced them. A lot of it is practice: "ooo, this looks like a graph problem, let me use a graph", etc.


Should be titled "How to Ace the Technical Interview in the Bay Area". Even absolute shithole bottom tier companies or unknown startups are asking these questions. Write perfect code on the whiteboard or get rejected. You have to put in A LOT of time into preparation even if you don't want to work at Google, which is ridiculous.

> For the phone interview, it will be on a Google document, and for the onsite interview, it will be writing code on a whiteboard.

That part is not completely correct. At the onsite you can choose to write code on a Chromebook, which will have a lightweight editor with syntax highlighting.


It wasn't really a choice when I interviewed there. All but one interviewer had me write code in the chromebook, which was my least favorite part of the interview process. The trackpad didn't respond to my slightly dry erase covered fingers, the keyboard was weird, and the quasi hangout software it was running crashed a few times, one of which required a full restart. That might not sound like a big deal but in a high pressure/time constrained environment it was a bit of a nightmare. I was told going in that the chromebooks would be available as an option, but writing a few lines of code only to be told to stop and switch over to the chromebook (which had either gone to sleep or frozen), have the interviewer log in and select the correct session, then select syntax highlighting, then finally being able to start writing code doesn't seem very conducive to maintaining a train of thought.

Tell your recruiter; interviewers are not supposed to force candidates to use Chromebooks.

I interviewed at Google in January, and they offered to let me use a computer if I had accessibility concerns with a whiteboard or really really wanted to, but discouraged it because they found it often made candidates too focused on the syntax of their code and less likely to have a meaningful high-level discussion with the interviewer.

Did they finish rolling them out? I know that is (was?) the plan, but when I interviewed last year at the Santa Monica location, they only had whiteboard.

Indeed not all locations have it, but I believe the major ones, like Santa Monica, should have it available by now.

It would be nice if an article on how to ace a coding interview did not have incorrect code in it. AFAICT the set-based algorithm for finding duplicates is wrong; the resulting set will contain items in the list that are not duplicated.

It doesn't work. Not just a coding issue either, as the code matches their explanation for this "improved" method.

They lost me when they got into showing how to make the 'dups' function faster. The author definitely either doesn't understand big-O notation or doesn't understand the complexity. Their O(1) implementation is anything but. Likely O(n×log(n)) at best. Also, their brute force implementation is unnecessarily verbose. Want dups?

  from collections import defaultdict
  def dups(seq):
    d = defaultdict(int)
    for x in seq:
      d[x] += 1
    return [k for k, v in d.items() if v > 1]
Assuming Python's defaultdict has O(1) lookup/insertion (which I think it does), this algorithm is a proper O(n) complexity.

Hmm. I don't think they claimed to have an O(1) time solution, just O(1) added space. Which, it is, but only because they're counting on the original array's underlying type having enough bits for their sign flipping. It would be as if you used a more compact type for the array elements, and then allocated another bitmap for the range of numbers.
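
For anyone following along, a minimal Python sketch of that sign-flipping idea (assuming, as the thread implies, that values are positive integers no larger than the array length):

    def dups_sign_flip(arr):
        # mark "seen" by flipping the sign of the slot each value maps to,
        # so the only extra space used is the output list
        result = []
        for v in arr:
            idx = abs(v) - 1
            if arr[idx] < 0:
                result.append(abs(v))   # slot already negative: seen before
            else:
                arr[idx] = -arr[idx]    # first sighting: flip the sign
        for i in range(len(arr)):       # undo the flips to restore the input
            arr[i] = abs(arr[i])
        return result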

Of course, once we start optimizing how the original array is stored, we may have exceeded the limits of this problem as a teaching exercise :)

As for time, it does seem to be O(n) to me; can you clarify why you think it's nlogn? It may not be particularly fast in practice when compared to other O(n) approaches like the bitmap, but I don't think the complexity is wrong.

Your solution is nice - it actually gives you more information (how many appearances, not just T/F >1 appearance), but it does require more additional space and isn't necessarily faster. I think the bitmap approach would be nicer if you're ok with using more space; the bitmap is essentially a very easy to find perfect hash function due to the unique input constraints.


They didn't say it uses O(1) complexity:

> Analyzing the above approach, we have an algorithm that takes O(n) time and uses O(1) space.


I've never seen an interview process that HN (and Reddit and Slashdot and ...) didn't trash as "deeply flawed", "biased", "unfair", "unreasonable", etc. At some point, though, a company has to have some sort of process, and by and large what they use works for them.

Question for hiring managers and employers:

In Silicon Valley, tech interviewing has become an arms race between applicants cramming to pass tech screens and interviews, and employers coming up with new routines. Sites like Glassdoor and CareerCup are loaded with interview questions that have appeared in those routines, giving savvy interviewees the opportunity to see the questions on the exam and prepare accordingly.

How do you feel about the existence of these sites, and do they affect how interviews are conducted?


As an interviewer I don't really care. A good candidate doesn't need them, and a poor candidate isn't helped by them. The only thing that's irritating to me is that they actually burn interview questions. Once a question is seen on an external job board it gets banned as an interview question.

Genuine Question: Apart from maybe the money or a nice resume entry, why would you/do you want to?

I don't work at Google (and I don't agree with some of the things the company's decided to do) but have many friends who enjoy working there. From what I hear, Google has an organizational structure that is very favorable for regular engineers. Once you're hired and you put in around a year of work on a team, it's almost trivial to find another team. Engineers also directly evaluate managers, and I've heard stories of mid- to high-level managers crying in bathrooms because of poor reviews from their reports. These factors combine to create an environment where teams are actively working to make engineers happy and content. Compared to many companies where managers make a lot of decisions in a room with no feedback given to or received from engineers, it's a heck of a lot better.

>I've heard stories of mid to high level managers crying in bathrooms because of poor reviews from their reports. These factors combine to create an environment where teams are actively working to make engineers happy and content.

Honestly this sounds like a thought experiment.

Do you have any ethical hangups about entering an environment where falling out of your favor can leave someone stress-crying in their place of work and/or worried about their livelihood?

If so, how much money would it take for you to join the system anyway?

How long would you stay in such a system if you found yourself already in one?

Back in the real world, in a business context, it sounds like an abusive workplace and an untenable system. Like, that obviously can't last forever.


Perhaps using the crying manager example was a bad idea on my part. What I can stand behind is having an org structure that encourages managers and execs to treat their employees well. It sounds like Google has done a better job than most. I'm sure there are managers and engineers crying in private in every big company out there. What I'm trying to say is even though a lot of people like to assume that people work for Google and stay there just for the money, Google probably does some things very well to keep all the talent despite the negative press it gets. And I think a major factor is how empowered a "regular" engineer feels in the company. It sounds like a step up from many other companies in that regard.

There are at least two problems with this:

- "I'm sure there are managers and engineers crying in private in every big company out there": You're not excusing google here, just expanding the range of companies whose apparent behavior is mortifying a couple of people in this thread.

- "Google probably does some things very well to keep all the talent despite the negative press it gets.": probably. They probably do a lot of a/b testing to dial in the compensation/retention ratio they're looking for, or maybe they just heap rewards onto engineers because they can afford it. Be that as it may, some people think that what google's doing is detrimental to society, or at least the problems are bigger than a salary or even a total compensation package should make up for.


I don't see any problem because I agree with what you said. Work shouldn't be so stressful that you cry in private. Google should do more "good" for the world.

I still think Google stands as an attractive workplace for reasons that are not just compensation and resume boost, though.


At some other companies it’s the engineers crying in the bathroom, just sayin

That there are even stories about such things makes me want to stay far, far away from there.

> I've heard stories of mid to high level managers crying in bathrooms because of poor reviews from their reports.

Is this your idea of an engineering nirvana? Does engineering happiness have to come at the expense of manager happiness?


It's not quite my idea of an engineering nirvana, no. I'm not sure where you got that impression.

> Compared to many companies where managers make a lot of decisions in a room with no feedback given to or received from engineers, it's a heck of a lot better.

A very substantial point actually.


> Apart from maybe the money or a nice resume entry

Return question: Do you have a firm grasp on quite how much money it is?


Unsolicited question: Does every man have his price?

I think so, but it will be hard for us to prove.

Both of those are pretty substantial.

The money is not quite life changing in the Bay Area but it's a significant acceleration towards retirement. I would not mind retiring in a few years, in my early 40s instead of working to 60+.

Those are not insignificant reasons.

There's no way this is actually a genuine question. It's the same reason anybody wants a job anywhere: because they'll eventually become homeless if they don't find a job.

There are many rungs on the ladder between Google and homeless.

Is it fair to assume "professional google interviewee" is a thing nowadays?

I've been trying to prepare and it makes me want to bang my head against a wall.

Prepping while working is draining.


There's an unconscious bias in this post. Google as a company is not only interviewing for engineering roles. Even for engineering roles, there are too many subcategories, and many don't follow the typical SWE interview process.

If you just want to know what the interview for your role would be, the recruiter from Google will be happy to give you an overview.


Seems like there have been a lot of sites like this in the past few years. Great.

Isn't the interview different depending on the job?

My immediate reaction to the title of the article was "... for SWEs and possibly SREs".

But to answer your question, yes, absolutely.


Any ideas what the process would look like for something like Solution Architect for Google Cloud? I can’t imagine that there would be graph and tree coding questions, but you never know.

I think it largely depends on the person interviewing you. I know candidates interviewing for Machine Learning positions get asked typical algorithm questions.

You can bypass the whole charade by knowing 2-3 people within Google who can provide "assurance" that you are good enough. Whiteboard testing is for grunts/unknowns without a network. Another way is to be a significant contributor to some popular open source project.

AFAIK internal referrals at best skip the phone screen interview.

As for popular open-source projects, I don't know about Guido van Rossum or Rob Pike, but Max Howell definitely whiteboarded in his interview: https://twitter.com/mxcl/status/608682016205344768


> You can bypass the whole charade by knowing 2-3 people within Google that can provide "assurance" you are good enough.

This is not a thing. Everyone goes through the same level-adjusted loop.

> Another way is to be a significant contributor to some popular open source project.

LOL no. Google is literally famous for rejecting major open source contributors for not knowing how to reverse a binary tree.


If they really want you, they'll do all they can to get you on board even if you're failing or aren't motivated to do their entrance interviews, and what I mentioned are two such criteria they use to bypass their process for very senior hires. They would literally give you a $3M signing bonus in extreme cases, or acquihire your team/company if you still didn't want to move.

They probably didn't want the Brew author that much; we don't know the details.


Is there something special about reversing a binary tree?

AFAIK, you could swap the child pointers and do that recursively for the child nodes. You could also do things O(1) by just changing the comparison function, perhaps by wrapping it to negate the comparison.
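
For reference, a quick sketch of that swap-and-recurse approach in Python (the Node class is made up for illustration):

    class Node:
        def __init__(self, val, left=None, right=None):
            self.val, self.left, self.right = val, left, right

    def invert(node):
        # swap the child pointers and recurse into the (former) children
        if node is not None:
            node.left, node.right = invert(node.right), invert(node.left)
        return node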


This is a specific, well-known case. Max Howell (the author of Brew) was rejected by Google. One of his interviewers asked him to invert a binary tree.

https://twitter.com/mxcl/status/608682016205344768?lang=en


How does this actually work?

Ask your recruiter if they offer you that option. If they do, you are ranked quite high already.


