
# Introduction
Everyone knows what comes up in data science interviews: SQL, Python, machine learning models, statistics, and sometimes a system design or case study. If it comes up in job interviews, that’s what companies check for, right? Not quite. They certainly test everything I mentioned, but they don’t test only that: behind all the technical tasks there’s a hidden layer that companies actually evaluate.


It’s almost a distraction: while you think you’re showing off your coding skills, employers are looking at something else.
That something else is the hidden curriculum – the skills that will actually reveal whether you will succeed in this role and in the company.


# 1. Can you translate business into data (and back)?
This is one of the most important skills employers look for. They want to see whether you can turn a vague business problem (e.g., “Who are our most valuable customers?”) into a data science or machine learning task, and then translate the findings back into plain language for decision-makers.
What to expect:
- Case studies framed loosely: For example: “Our app’s daily active users are flat. How would you improve engagement?”
- Follow-up questions that force you to justify your analysis: For example: “What metric would you like to track to find out if engagement is improving?”, “Why did you choose this metric over session length or retention?”, “If management is only interested in revenue, how would you redesign your solution?”
What they actually test:
- Clarity: Can you explain your findings in plain English, without too many technical terms?
- Prioritization: Can you highlight the main insights and explain why they are critical?
- Audience awareness: Do you change your language depending on your audience (technical or non-technical)?
- Confidence without arrogance: Can you clearly explain your approach without becoming overly defensive?
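To make this translation concrete, here’s a minimal sketch (plain Python, with invented sample data) of pinning the vague question “Are users engaged?” down to two competing candidate metrics:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log: (user_id, day, session_seconds)
events = [
    ("u1", date(2025, 9, 1), 300),
    ("u1", date(2025, 9, 2), 120),
    ("u2", date(2025, 9, 1), 60),
    ("u3", date(2025, 9, 2), 900),
]

# Candidate metric 1: daily active users (breadth of engagement)
dau = defaultdict(set)
for user, day, _ in events:
    dau[day].add(user)

# Candidate metric 2: average session length (depth of engagement)
avg_session = sum(seconds for _, _, seconds in events) / len(events)

print({str(d): len(users) for d, users in sorted(dau.items())})
print(avg_session)
```

Each metric answers a different business question; being able to say which one the decision-maker actually needs, and why, is the skill under test.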
# 2. Do you understand the tradeoffs?
In your job you will constantly have to make trade-offs, e.g., accuracy vs. interpretability, or bias vs. variance. Employers want to see you make them in job interviews, too.
What to expect:
- Questions like: “Would you use a random forest or logistic regression here?”
- No correct answer: scenarios where both answers may be correct, but the interviewer wants to hear how you justify your choice.
What they actually test:
- Model awareness: Do you understand that there is no universally “best” model?
- Identifying trade-offs: Can you articulate them in an uncomplicated way?
- Business Alignment: Do you demonstrate an awareness of adapting your model choices to business needs rather than chasing technical perfection?
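A toy sketch of the trade-off, using invented data: an interpretable one-rule model versus a model that just memorizes the training set. The memorizer looks perfect in training and falls apart on unseen data; the simple rule is slightly worse in training but generalizes:

```python
# Invented records: (age, bought). The train/test split is arbitrary.
train = [(25, 0), (40, 1), (34, 0), (22, 0), (50, 1), (36, 1)]
test = [(30, 0), (45, 1), (26, 0), (38, 1)]

def simple_rule(age):
    """Interpretable, low-variance: predict 'bought' if age >= 32."""
    return 1 if age >= 32 else 0

memory = dict(train)

def memorizer(age):
    """High-variance: recall the training label, guess 0 for anyone new."""
    return memory.get(age, 0)

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

print("train:", accuracy(memorizer, train), accuracy(simple_rule, train))
print("test :", accuracy(memorizer, test), accuracy(simple_rule, test))
```

Neither model is universally “best”: the memorizer wins on training accuracy, the rule wins on generalization and explainability, and that is exactly the kind of trade-off interviewers want you to narrate.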
# 3. Can you work with imperfect data?
Data sets in interviews are rarely spotless. There are usually missing values, duplicates, and other inconsistencies. This is an intentional reflection of the actual data you will have to work with.
What to expect:
- Imperfect data: tables with inconsistent formats (e.g. dates displaying as 2025/09/19 and 19/09/25), duplicates, hidden gaps (e.g. missing values only in specific time periods, e.g. every weekend), edge cases (e.g. negative quantities in the “items sold” column or customers aged 200 or 0)
- Analytical reasoning questions: questions about how you would validate your assumptions
What they actually test:
- Your instinct for data quality: Instead of mindlessly coding, do you stop and question the data?
- Prioritization in data cleaning: Do you know which issues are worth fixing first because they have the greatest impact on your analysis?
- Handling ambiguity: Do you clearly articulate your assumptions so that your analysis stays transparent and you can proceed with the risks acknowledged?
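As an illustration (plain Python, hypothetical rows), a first pass over a table like the one above might normalize the two date formats, drop exact duplicates, and flag impossible ages for discussion instead of silently deleting them:

```python
from datetime import datetime

# Hypothetical raw rows showing the flaws described above: mixed date
# formats, an exact duplicate, and impossible ages.
rows = [
    {"id": 1, "date": "2025/09/19", "age": 34},
    {"id": 2, "date": "19/09/25", "age": 200},   # impossible age
    {"id": 1, "date": "2025/09/19", "age": 34},  # exact duplicate
    {"id": 3, "date": "2025/09/20", "age": 0},   # suspicious age
]

def parse_date(s):
    """Disambiguate by the year position: a 4-digit first field means
    YYYY/MM/DD, otherwise assume DD/MM/YY (a stated assumption)."""
    fmt = "%Y/%m/%d" if len(s.split("/")[0]) == 4 else "%d/%m/%y"
    return datetime.strptime(s, fmt).date()

seen, clean, flagged = set(), [], []
for row in rows:
    key = (row["id"], row["date"])
    if key in seen:          # drop exact duplicates
        continue
    seen.add(key)
    row = {**row, "date": parse_date(row["date"])}
    # Assumption, stated out loud: plausible customer ages are 1-119.
    (clean if 0 < row["age"] < 120 else flagged).append(row)

print(len(clean), len(flagged))
```

The flagged rows are the ones to raise with the interviewer, which is the stop-and-question instinct in action.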
# 4. Do you think in experiments?
Experiments are a huge part of data science. Even if the role isn’t explicitly experimental, you’ll be expected to run A/B tests, pilots, and validations.
What to expect:
- Experiment design questions: for example, “How would you test whether a new onboarding flow improves retention?”
- Interpretation questions: you’re shown test results and asked whether the observed difference actually matters
What they actually test:
- Your ability to design experiments: Do you clearly define control and treatment, perform randomization, and consider sample size?
- Critically Interpreting Results: When interpreting experimental results, do you consider statistical significance versus practical significance, confidence intervals, and secondary effects?
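As a sketch of the mechanics, here is a standard two-proportion z-test written with only the Python standard library, applied to invented A/B numbers:

```python
from math import erfc, sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail of the normal
    return p_b - p_a, p_value

# Invented experiment: control converts 200/5000, treatment 250/5000.
lift, p = two_proportion_ztest(200, 5000, 250, 5000)
print(f"lift={lift:.2%}, p={p:.3f}")
```

Even when p comes in below 0.05, the practical question remains: is a one-percentage-point lift worth the cost of shipping? That distinction between statistical and practical significance is what the interviewer is listening for.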
# 5. Can you stay calm in the face of ambiguity?
Most interview questions are deliberately open-ended. Interviewers want to see how you handle imperfect, incomplete information and instructions, because, guess what, that’s exactly what you’ll get in the actual job.
What to expect:
- Vague questions without context: for example, “How do I measure customer engagement?”
- Continuing with clarifying questions: For example, you could try to clarify the above by asking: “Do we want to measure engagement by time spent in the app or by number of sessions?” Then the interviewer might put you on the spot by asking, “Which would you choose if management had no preference?”
What they actually test:
- Composure under uncertainty: Do you freeze, or do you stay calm and pragmatic?
- Problem structuring: Can you impose order on an unclear request?
- Stating assumptions: Do you articulate your assumptions clearly so they can be challenged and refined in later iterations of the analysis?
- Business reasoning: Do you tie your assumptions to business goals, or are they arbitrary guesses?
# 6. Do you know when “better” is the enemy of “good”?
Employers want you to be pragmatic, which means: can you produce the most useful results as quickly and as simply as possible? A candidate who would spend six months improving model accuracy by 1% isn’t exactly what they’re looking for, to put it mildly.
What to expect:
- Pragmatism questions: Can you find an uncomplicated solution that solves 80% of the problem?
- Probing: the interviewer presses you to explain why you would stop there.
What they actually test:
- Judgment: Do you know when to stop optimizing?
- Business alignment: Do you connect your solutions to business impact?
- Resource awareness: do you respect the team’s time, costs and capabilities?
- Iterative mindset: Do you ship something useful now and improve later, rather than spending too much time developing the “perfect” solution?
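A minimal sketch of that iterative mindset, with invented numbers: ship a trivial “predict the average” baseline first, then let its error decide whether a fancier model is worth anyone’s time:

```python
history = [12, 15, 14, 16, 13, 15, 14]  # last week's daily sales (invented)
actuals = [14, 15, 13]                  # the next three days

baseline = sum(history) / len(history)  # the simplest possible model

def mae(preds, actual):
    """Mean absolute error."""
    return sum(abs(p - a) for p, a in zip(preds, actual)) / len(actual)

baseline_mae = mae([baseline] * len(actuals), actuals)
print(round(baseline_mae, 2))
```

If an error of under one unit per day is already good enough for the business decision, stop here; any more sophisticated model now has a concrete number to beat before it justifies its cost.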
# 7. Can you handle pushback?
Data science is collaborative and your ideas will be challenged, so job interviews reflect this.
What to expect:
- Critical reasoning tests: interviewers deliberately push back and look for flaws in your approach
- Fit tests: questions like, “What if management doesn’t agree?”
What they actually test:
- Resilience under scrutiny: Do you stay calm when your approach is questioned?
- Clarity of Reasoning: Are your thoughts clear to you and can you explain them to others?
- Adaptability: If the interviewer reveals a gap in your approach, how will you respond? Do you admit it gracefully, or do you get offended and run out of the office crying and screaming curses?
# Conclusion
You see, technical interviews aren’t really what you thought they were. Notice that the entire hidden curriculum basically consists of:
- Translating business problems
- Managing trade-offs
- Dealing with messy, ambiguous data and situations
- Knowing when to optimize and when to stop
- Collaborating under pressure
Nate Rosidi is a data scientist and product strategist. He is also an adjunct professor of analytics and the founder of StrataScratch, a platform that helps data scientists prepare for job interviews using real interview questions from top companies. Nate writes about the latest career trends, gives interview advice, shares data science projects, and discusses all things SQL.
