
I wish companies didn't think their interviews were some weird "secret sauce" -- I've had companies flat out refuse to tell me what the interview process would be like, or even how long I should expect for the interview beyond "1-4 hours". Well, you know 1 hour is a lot different than 4 hours!

Personally I think all companies should be using pair-programming or contract to hire, as I think trivia questions and whiteboarding are worse than useless. I'm happy to do take home work for a company I'm really interested in, but it does feel a little unfair and like a bit of a waste of time overall.



I think a part of the secrecy is so that if it goes poorly they can stop the interview after a really bad session without indicating to the candidate that it's gone so badly. "Yup, it's always just two sessions, goodbye." This is an actual rationale I've heard. I don't agree with it, but it's a thing.


For anyone who finds this to be a plausible reason, I did too. Then I realized that a more transparent company would simply tell me that they're cutting the interview short.


There's a comfort advantage to both parties if you do rejection over phone/email instead of in your office, in my experience.

Also, I've seen interviews cut short for both extremes (going really well and going really poorly) so it's easy to give the wrong impression that way.


That's an awful candidate experience and misses an opportunity to get more interviewers calibrated :(


I'd rather they be open, honest, and direct with the rejection than tolerant and polite. It saves a lot of time and emotional investment.


As an employer, I have found that if you honestly tell 10 people the reason you did not hire them, at least 3 of them will react with hostility.

I do agree that it would be better for all employers to give feedback to all candidates, and for those candidates to handle the rejection in an emotionally mature way.


Most companies do not handle candidate review in an emotionally mature or professional way, and the widespread lack of feedback is one of the ways they secure protection from possible downsides of their own blatantly disrespectful and sometimes outright discriminatory hiring practices.

Assuming that you are like the other 90% or so of employers who treat employees and candidates exceptionally poorly (I'm not saying you are, just assuming for a second), then getting only a 30% rate of feedback calling you out on use of superficial, inconsistent, and (in some cases) borderline (or even blatantly) illegal reasons for rejecting a candidate doesn't sound so bad.

It suggests to me that candidates on the whole are more professional than companies, but companies can use certain bureaucratic policies and processes to make sure that unprofessionalism is effectively hidden from any channel through which it could come back to negatively affect them.


I'm sure it's not for no reason. Is it difficult to imagine a (particularly bad) candidate throwing a miniature temper tantrum if s/he knew that leaving before lunch was a rejection?


> candidate throwing a miniature temper tantrum

This happens?


In my previous company, one candidate became belligerent and physically pushed the interviewer away from the whiteboard after the interviewer told him his solution was not correct. The interviewer asked the candidate to wait a few minutes, left the room, and called the recruiter, who in turn called security to escort the candidate out. My desk was just outside the interviewing room, and I saw the long glance the guy gave the interviewer - it said "if I see you again, I will kill you". Or at least I interpreted it that way.

The company (and the interviewer) decided not to press charges, so I assume he continues to interview at other places. There is a non-zero number of unstable individuals who do not handle rejection or criticism well.


And the problem here is that there is no recourse for the company to help stop this individual from being the same problem for somebody else.


Sure beats finding out after you hire the psychopath.


"Trivia questions" are not useless. I've put out job ads only to receive hundreds of applications from all over the world, many not even in English, or without a single word accompanying the CV. Of the "proper" applicants, I've then had to deal with dozens of candidates who claimed a competence they did not have. And guess what - that's a lot of effort and time taken away from the business. In my case, that's a big opportunity cost, with salaries burning a deeper hole for every minute they are not offset by client time. I suspect it's much worse for famous organisations.

So now I ask a dozen questions by email that should take 5 minutes to answer in order to filter out both these groups, and that cover many basic CS concepts. Works really well, even if it's a bit annoying for candidates and probably turns off many vs just uploading your CV for HR to auto-scan for keywords.

For example, if I'm hiring someone to do data modelling work and they tell me it's called the relational model because of the relations between tables, it's unlikely they'll know how to unfold EAV antipatterns. And if you have actual machine learning experience, it's unlikely you'll be stumped by my asking what high leverage means.
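For readers unfamiliar with the EAV antipattern mentioned above, here is a minimal hypothetical sketch of what "unfolding" means: pivoting generic entity-attribute-value rows back into one record per entity. The rows and attribute names are invented for illustration; real unfolding usually happens in SQL against a much messier schema.

```python
# Rows from a generic entity-attribute-value table: one row per "fact".
eav_rows = [
    (1, "name", "Alice"), (1, "email", "alice@example.com"),
    (2, "name", "Bob"),   (2, "phone", "555-0100"),
]

def unfold(rows):
    """Pivot each entity's attribute/value pairs into a single record."""
    records = {}
    for entity_id, attribute, value in rows:
        records.setdefault(entity_id, {})[attribute] = value
    return records

wide = unfold(eav_rows)  # {1: {"name": "Alice", ...}, 2: {...}}
```

Note that entities end up with ragged records (entity 2 has no email), which is exactly why EAV data resists conventional relational modelling.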


Have "actual machine learning" experience, no idea what "high leverage" means.


I'm referring to leverage in statistics [1], and particularly how high leverage observations may affect your model and how people deal with them. Clearer?

For context: one of the candidates I interviewed told me that the best way to pick a model was to pick the model that would have the highest R squared when fitted to the whole dataset. I asked him about overfitting and he didn't know what I was talking about. Same guy whose CV showed 4 years in a research lab doing stats [2]. A lot of people are just going through the motions.

[1] https://en.wikipedia.org/wiki/Leverage_(statistics)

[2] reposting for fun: http://4.bp.blogspot.com/-IJo_Tkw95-o/VANnTdCFgHI/AAAAAAAADV...
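To make the point about model selection by full-data R² concrete, here's a small NumPy sketch (all data invented): a "model" that simply memorises the training data achieves a perfect R² on the data it was fitted to, while a plain least-squares line generalises better to held-out points.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = 3.0 * x + rng.normal(0, 0.3, size=200)

# Hold out every other point so both halves cover the same range.
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    return 1.0 - ss_res / np.sum((y_true - y_true.mean()) ** 2)

# Model 1: a plain least-squares line.
coeffs = np.polyfit(x_train, y_train, 1)

# Model 2: a memoriser that predicts the y of the nearest training x.
def nearest(xs):
    idx = np.abs(xs[:, None] - x_train[None, :]).argmin(axis=1)
    return y_train[idx]

r2_line_train = r_squared(y_train, np.polyval(coeffs, x_train))
r2_mem_train = r_squared(y_train, nearest(x_train))   # exactly 1.0
r2_line_test = r_squared(y_test, np.polyval(coeffs, x_test))
r2_mem_test = r_squared(y_test, nearest(x_test))
```

Selecting by R² on the fitted data would pick the memoriser every time; only the held-out score reveals the overfit.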


Yeah... Have actual machine learning experience as well, never heard the term leverage as applied here. Sometimes the nomenclature can confuse people, and I guess the original point about more in-depth interviewing processes was related to discerning whether the fundamental knowledge and team fit are there, not whether the candidate is aware of the particular subset of CS trivia the interviewer is currently interested in. I say this while admitting that I've made this mistake in the past while interviewing candidates for CS positions...


What do you call high leverage observations then?


This is similar to "multicollinearity". I have two graduate degrees in applied statistics and machine learning, and did math as an undergrad, but I did not ever even hear the term "multicollinearity" until I got my first job after grad school in a place that used a lot of regression models.

I had studied "spurious correlation" in some machine learning courses and in some research, but almost all of the methods I ever studied or worked on were meant to use data-driven methods to account for spurious correlation, or dimensionality reduction to find subspaces in which the data's natural structure was best preserved without correlation-caused redundancy. Literally none of them ever even mentioned the phrase "multicollinearity" -- which seems to be more popular to people coming from classical statistics or econometrics programs.

If I had been asked to describe "what is multicollinearity" during the interviews, I would have been rejected, yet after they told it to me, I learned about it in about 2 hours on Wikipedia, and within two months I had actually done a research project for them where I showed them how you could use ISOMAP or randomized PCA to effectively handle regression multicollinearity better than their ad hoc covariate-averaging techniques.
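The idea that PCA can tame regression multicollinearity is easy to demonstrate with a toy sketch (invented data, plain NumPy rather than their actual pipeline): with two nearly-duplicate predictors, ordinary least squares pins down their sum but not their individual coefficients, while regressing on the leading principal component gives stable coefficients for both.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two predictors that are almost copies of each other (multicollinearity).
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
y = x1 + x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([x1, x2])
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# OLS: the design matrix is nearly singular, so the individual
# coefficients can land far from the "true" values of (1, 1),
# even though their sum is well determined.
beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)

# Principal-component regression: project onto the leading component,
# which captures the shared direction of x1 and x2.
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
z = Xc @ vt[0]                      # scores on the first component
gamma = (z @ yc) / (z @ z)          # 1-D regression on the scores
beta_pcr = gamma * vt[0]            # map back to the original predictors
```

Here `beta_pcr` sits near (1, 1) because the first component is the shared direction; dropping the tiny second component is what removes the instability.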

This kind of trivia stuff is useless for hiring. All it does is let the interviewer feel smart and tout their favorite particular buzzwords to see if the candidate is in the same "club" as them.

I can imagine some diehard frequentist caring a shit ton about "consistent estimator" or various special tests like Kolmogorov-Smirnov or F-test or likelihood ratio test. If you say, "I always do Bayesian stats... those test thingies are stupid junk" then it means "This person is not in the Frequentist Club, let's reject their lame ass and tell them it's cause they are 'just not a good fit.'"

Same would be true in reverse if some Bayesian diehard interviews a frequentist person.

Same for someone who sees deep learning as the hammer for every nail. Etc. etc.

People don't want to admit they treat these things kind of like baseball trading cards, and it's more about your cool-kid status than your actual tech skill.


However, if you ping me on my profile email after I put up a job on Who's Hiring, and I send you back an email saying "hey, can you answer these few filter questions, take your time" and one of them is "what is multicollinearity", chances are you'll google it and tell me what you typed above and it will be fine. It's not like you can't use Google or take your time.

What I'm dealing with is people who are dumping their CV and a standard cover letter on me at high frequency and I do not want to put an HR filter between them and me, because I want to spot the guy without a LinkedIn profile (HR red flag), who dropped out of college (so no brand - HR red flag) but has plenty of useful experience (so no buzzwords, because real experience looks unremarkable on a CV - HR red flag). I know no more efficient way. Would love to hear of any.

I did make the mistake many years ago of rejecting a PhD in stats because she admitted not knowing about neural nets ("but my lab buddy is doing some research in them"). I'm still kicking myself, but have learnt a bit about hiring since then.


By way of example, I do not have a LinkedIn profile, and I left a Ph.D. program with a master's only. Though my degrees are from prestigious schools, I don't think school status should count for as much as it does.

I am lucky that the prestige of my schools is high, that you can easily Google some of the actual research work I did, and that I am very highly ranked on Stack Overflow.

If I didn't have those, I think more employers would simply reject me for not having a LinkedIn account, because employers use the dumbest shit (like LinkedIn) as hiring cues.


Ah, we usually call them outliers and use robust techniques or non-L2 stats when they are pervasive in the data. Still, in my experience, handling outliers is an application-dependent thing, as they may be benign or they may suggest another statistical model entirely.

OTOH, I agree that not being aware of overfitting is definitely grounds for setting aside a candidate for an ML job.


There's a difference. Imagine a simple one dimensional linear regression.

A value with x close to the mean but an extremely high y would barely move the line as a whole up or down as you vary y, and the gradient would not be much affected.

A value with extremely high x does not need to move much to change the gradient of the line. You might even change the sign of the gradient.

Both are outliers but the second has higher leverage than the first.

Edit: illustration: http://imgur.com/vVKwjHh (right hand side are high leverage)
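That picture can also be reproduced numerically; here's a toy sketch with made-up numbers. An outlier sitting exactly at the mean of x leaves the fitted slope essentially unchanged (it only shifts the line vertically), while a single far-out-in-x point drags the slope way down, even flipping its sign.

```python
import numpy as np

# A clean linear trend: y ~ 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=20)

def slope(xs, ys):
    """Least-squares slope of a 1-D linear fit."""
    return np.polyfit(xs, ys, 1)[0]

base = slope(x, y)  # close to the true slope of 2

# Low leverage: an outlier at the mean of x, far off in y.
# It shifts the line up but leaves the slope essentially unchanged.
x_low, y_low = np.append(x, 5.0), np.append(y, 60.0)

# High leverage: an outlier far out in x.
# This one point drags the fitted slope well below the true value of 2.
x_high, y_high = np.append(x, 100.0), np.append(y, 0.0)
```

Comparing `slope(x_low, y_low)` and `slope(x_high, y_high)` against `base` shows the asymmetry: both added points are equally "outlying" in residual terms, but only the extreme-x one has the leverage to rotate the line.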


Do you mind if I ask where you work? I find most people practising "machine learning" have barely gotten past k-means, naive Bayes, and SVMs, with very little actual understanding of how or why they work and where they fall apart.


I run my own data consulting firm based in Asia.

Originally, we were going to build high dimensional statistical learning systems crunching large datasets on GPUs in Haskell.

90% of clients cannot feed me the data, so I end up building them a data warehouse first; in some cases we even redo their data model. 100% of clients have datasets too small to justify getting out of R and single CPUs, so our Haskell ML libraries are, so far, on the wishlist only (I think Tweag has had more success there, but I still have feeble hope that one day...) Hence the RM focus as well as ML. The ML side is sort of picking up this month.

That being said, I'm not actually that well versed in ML. Just the guy running the company. I've built (and read) just enough to know how it works, how it applies to clients and who to hire to get the job done properly...


I think what may be tripping people up here is your use of the term machine learning. Personally, I think the way you're using it is how it should actually be used: applied statistics. However, a lot of people think of machine learning as a collection of algorithms they can use to make predictions about some given dataset: use sklearn to model.fit(xtrain, ytrain) and model.score(xtest, ytest). I personally blame Coursera and the online courses for this trend.


Do you ever have the impression that the model you are going to apply influences how you clean the data? What happens then when you redo the model?


Interesting. I guess leverage is one of those terms that one might not encounter frequently except in some specific situations.

I don't claim to be an expert, but at least I know what a moment is.


There are an infinite number of such concepts in "machine learning." Just because you liked this one doesn't make it a good interview question. If anything, you are failing at theory of mind - as evidenced here, most people aren't making lots of use of the concept. When interviewing, don't be a dick. Test people for general ability and willingness to learn, not stupid questions you got out of a random textbook.


Everyone learns different vocab depending on what discipline you learned about stats/ml. Leverage is a very standard stats term, but I've never seen it used in an ml context.


I've only given a couple of hiring interviews so far in my short career, but I've found specific, factual questions to be vital insofar as they truthfully clarify candidates' knowledge levels in their specific areas of expertise. People will generally say 'yes' when asked if they are familiar with something during an interview, even if their knowledge is tenuous, and as you said, resumes are not always totally accurate. I understand the negative reactions these types of questions get online - it's easy to imagine oneself forgetting something in the heat of the moment or missing out on an opportunity for a silly reason - but a qualified candidate should be able to give a plausible answer or ask a cogent clarifying question. If anyone has a better method for filtering out the noise people employ to pitch themselves, I'd be happy to hear it, but I haven't seen one yet.


A long time ago I was interviewing with Facebook. They told me exactly what to expect from the process. I was actually shocked that they were giving away so much information. I studied for the interview and did very well in the first half, and then horribly the second half. Overall I enjoyed it because of how honest they were.


I had a similar experience with Facebook although I did well until the last one. Then I was just too tired and hadn't practiced enough to do a good job. After the fact, I read more of the interview questions on Glassdoor and wished I'd read them before and practiced those on a whiteboard at home.

So that is what I'll do next time I want to interview at FB. Overall, I really enjoyed the process as an interviewee. I've been on both sides and I'm curious how well the other side is done at FB but I left with a good impression and thought the outcome was fair.

Also bonus points to them for asking real world questions. I was asked about making a web-based text editor and a couple months later, Draft.js came out.


I had a different experience with Facebook. I was told what the technical phone interview would consist of, and that much was true, but I was also told I would receive feedback within a certain time frame.

After my interview (in which I felt I did very well), I did not hear anything back within the time frame they told me, and then many days afterwards I finally heard back.

They said they were not going to move forward with me, and also they specifically said they were sorry but they could not provide any feedback.

So I actually felt like for one of the most important parts (the feedback) they were actually not straightforward or honest with me. Had I known I could not have feedback, I would not have agreed to the interview in the first place.


98% of interviews won't tell you why you got rejected. So do you not interview anywhere?


I do refuse to complete any form of test or assessment if they will not guarantee detailed feedback ahead of time. I do reject a lot of employers very early on.

98% of employers treat employees and candidates like shit, so this is a useful and efficient rejection rate if you refuse to work anywhere but the 2% that actually treat you like a human being in addition to paying fairly, offering reasonable benefits, etc.


I just wish companies wouldn't lie.

(Why yes, today, I did just talk to a YC company that had "remote" in their job ad on their site but, when asked, was unwilling to do wfh 2 days a week for someone who lives in the peninsula and was unenthusiastic about driving in and out of sf every day. Hmm...)


> Personally I think all companies should be using pair-programming or contract to hire

You should qualify that with "all companies by which I want to be hired", because not all teams/developers pair program or want to pair program.

The team I'm on does a phone tech interview, then an in-person interview where we dig into history, then a mix of problem solving, pseudocode problems, OO/schema design, and web app design/technologies/approach, and then more interviews with the extended team. While we will sit with each other to help out at times, sometimes for hours, we don't strictly pair - that's just when help is needed. We rely on some up-front design and code review of all PRs for quality control, in addition to QA and approval by the application owner.


I've never had this experience. Is this a Bay Area thing? In my experience companies are usually upfront about what their process is, if indeed they have a process at all.


1-4 hours makes sense. If you're terrible, the interview finishes in an hour; it's 4 hours if they're seriously considering you.


That could be because they are inexperienced, and are making up their interview process as they go along? This is definitely quite common IME.


If they say 1-4 hours then 1 hour usually means you aren't getting the job. 3-4 hours means you are still in the running, but still probably won't get the job.

I am with you on contract-to-hire. I get that companies are cautious because a bad hire can be tough to recover from, but the processes 99% of companies do are not conducive to good hires.


This. I hired a few times and rapidly learned not to promise the candidate a fixed amount of time. Why waste everyone's time on more interviewing if the first interviewer finds something that reveals the candidate is not a fit?


Because if they took the time to prepare for the interview and budget for an upper bound on the time range, then it is only professional and courteous of you to give them a full review.

If you reject them based on their performance during hour 1, and you don't even give them the chance to show you more about themselves during hours 2, 3, and 4, it sounds like (a) your hiring process is kind of dumb and you're way too focused on what you seem to think is hyper efficient time usage than on actually learning about a candidate, and (b) you come off as unprofessional.

It's a two-way street. Inviting someone in for a "1-4" hour block, then dismissing them after hour 1 makes it seem like you view it as if you're entitled to paw around the candidate like they are a cut of meat, assess them however you want, and not give them a fair chance to assess you.


> Because if they took the time to prepare for the interview...

That's very charitable of you.

I don't know if it was my inability to phone screen properly, my inadequate job posting, my desperation to hire at the salary my boss allowed me, or my optimism, but I had far too many candidates who didn't pass even the above low bar. I don't even want to talk about the candidates who didn't make it through the phone screen.

If it matters, the timeframe wasn't 1-4 hours, but rather 1-2, and the position was junior.


I am not able to understand which parts of your comment are sarcastic and which aren't.

Are you saying you assume candidates did not prepare? If so, then I think you clearly have a problem with how you perceive candidates and you should work on that instead of dismissing candidates so quickly and believing it's their fault.

Even a candidate who performs badly in the interview may have spent a ton of time preparing. They may just honestly not be the right choice for the role, due to technical mismatch, even though they have an excellent work ethic, they studied, they had their friend come over and role play interview questions, they couldn't sleep the night before worrying if the cheap suit they borrowed from their uncle is going to make them look bad, etc. etc.

You should basically view literally every candidate as a person who is trying their best. If that turns out to be false, oh well -- it's not your business to care about forming opinions about candidates who you think weren't giving their best. Just assume they were, that it didn't work out, and move on.

Again, I may not be reading your comment right. But if you feel it's "charitable" to assume any given candidate sitting in front of you studied hard, cares about the job, works hard, and really wants to present themselves well, that's just nuts.


When I was on the company side of it, we did 4 interviews. If the first two said no, there was no point in wasting time on the second two, but #1 would always be the 'handler' for the process, so if his was a no and he got the signal from #2 that it was also a no, he would tell the candidate that the process was over for the day. If either voted yes, it went to #3. You had to get 3/4 to continue to the offer phase.


Agreed. As someone in the interview process once again, and being a bit more introspective this time, it's just really amazing how BS the entire whiteboard/share-screen "code this out for me" exercise is. It seems to be a better indicator of how many type-X problems you've seen before, and employers seem to care more about whether you can do something cold than whether you can work through a problem to an adequate solution.



