5 Data Science Interview Questions Part VIII

1. What is the difference between Type I and Type II errors?

Edna Figueira Fernandes
2 min read · Sep 20, 2020

A Type I error is a false positive: we reject a null hypothesis that is actually true.

A Type II error is a false negative: we fail to reject a null hypothesis that is actually false.
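One way to make the Type I error concrete is to simulate many experiments in which the null hypothesis is true and count how often a test rejects it anyway. A minimal sketch (the sample size, seed, and the use of a z-test with known variance are illustrative assumptions):

```python
import numpy as np

# Simulate experiments where the null hypothesis (mean = 0) is TRUE.
# Any rejection here is a Type I error, so the rejection rate should
# land near the chosen significance level alpha.
rng = np.random.default_rng(42)
alpha, trials, n = 0.05, 2000, 30
z_crit = 1.959964  # two-sided critical value for alpha = 0.05

rejections = 0
for _ in range(trials):
    sample = rng.normal(loc=0.0, scale=1.0, size=n)  # null is true
    z = sample.mean() * np.sqrt(n)  # z-statistic (sigma known = 1)
    if abs(z) > z_crit:
        rejections += 1  # false positive: a Type I error

type_i_rate = rejections / trials
print(type_i_rate)  # close to alpha = 0.05
```

The significance level alpha is exactly the Type I error rate we accept by design; controlling Type II errors instead requires thinking about statistical power.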

2. Are expected value and mean value different?

The underlying concept is the same, but the terms are used in different contexts. The expected value describes a random variable or probability distribution: it is the probability-weighted average of all possible outcomes. The mean (or sample mean) usually refers to the average of an observed sample, which serves as an estimate of the expected value.
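A fair six-sided die makes the distinction concrete: its expected value is defined by the distribution, while the sample mean is computed from observed rolls and only approximates it (the die example and sample size are illustrative choices):

```python
import numpy as np

# Expected value: a property of the distribution itself.
faces = np.arange(1, 7)
probs = np.full(6, 1 / 6)          # fair die: uniform probabilities
expected = float(np.sum(faces * probs))  # probability-weighted average: 3.5

# Sample mean: computed from observed data; estimates the expected value.
rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=100_000)
sample_mean = float(rolls.mean())

print(expected, sample_mean)  # 3.5 vs. an estimate close to 3.5
```

By the law of large numbers, the sample mean converges to the expected value as the sample grows.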

3. Does gradient descent always converge to the global minimum?

Not always. For a convex function, which has a single global minimum, gradient descent with a suitably small learning rate will converge to that minimum.

When the function is non-convex, the outcome depends on where we start: gradient descent may settle in a local minimum rather than the global minimum.
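The convex case can be sketched in a few lines. Here gradient descent is applied to f(x) = (x − 3)², a convex function whose only minimum is at x = 3 (the learning rate, step count, and starting point are illustrative):

```python
# Gradient descent on the convex function f(x) = (x - 3)^2.
# Its gradient is f'(x) = 2 * (x - 3), and the single global
# minimum sits at x = 3, so the iterates converge there.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step against the gradient
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # ≈ 3.0
```

With a non-convex objective the same loop would still run, but the point it returns would depend on `x0`, which is exactly the sensitivity to initialization described above.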

4. What is the bias/variance trade-off in data science?

The bias/variance trade-off is the conflict in trying to minimize both of these sources of error simultaneously when building models that generalize to unseen data.

A model with high bias makes overly strong assumptions and fails to learn the true relationships between the features and the target. Such models are less complex, and they tend to underfit the data.

A model with high variance learns too much of the noise in the training data. Such models are more complex, and they tend to overfit the data.
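The trade-off shows up clearly when fitting polynomials of different degrees to the same noisy data: a low-degree (high-bias) model underfits, while a high-degree (high-variance) model can drive training error down by chasing noise. A small sketch (the sine-plus-noise data and the degrees 1 and 9 are illustrative assumptions):

```python
import numpy as np

# Noisy samples from a sine curve: the "true" relationship is nonlinear.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.size)

def train_mse(degree):
    """Training error of a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

underfit_err = train_mse(1)   # high bias: a line cannot follow the sine
flexible_err = train_mse(9)   # high variance: flexible enough to chase noise
print(underfit_err, flexible_err)
```

The degree-9 fit always achieves a lower training error (its hypothesis space contains every line), but that does not mean it generalizes better: on fresh data its noise-driven wiggles become prediction error, which is the variance side of the trade-off.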

5. What are recommender systems?

Recommender systems are algorithms that predict a user's preference for items, commonly by learning from the behavior of similar users (collaborative filtering) or from the attributes of the items themselves (content-based filtering).
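A tiny item-based collaborative-filtering sketch shows the idea: score each unrated item by how similar it is to the items the user already rated highly. The rating matrix and the cosine-similarity weighting are illustrative assumptions, not a production design:

```python
import numpy as np

# Hypothetical user-item rating matrix (rows = users, cols = items, 0 = unrated).
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 2],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def item_cosine_sim(R):
    """Cosine similarity between item (column) rating vectors."""
    norms = np.linalg.norm(R, axis=0)
    return (R.T @ R) / np.outer(norms, norms)

def recommend(R, user):
    """Return the unrated item with the highest similarity-weighted score."""
    sim = item_cosine_sim(R)
    rated = R[user] > 0
    scores = sim[:, rated] @ R[user, rated]  # weight ratings by item similarity
    scores[rated] = -np.inf                  # never re-recommend rated items
    return int(np.argmax(scores))

print(recommend(R, user=0))  # → 2 (the only item user 0 has not rated)
```

Real systems add normalization, handle the sparsity of the rating matrix, and often use matrix factorization instead, but the core prediction task is the same.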
