
FIT2086 Week 7
Authored by Bisan Salibi
Information Technology (IT)
University

10 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the primary objective when fitting a linear regression model using the least squares method?
To maximize the correlation between the independent variables.
To minimize the number of predictors in the model.
To minimize the sum of the squares of the residuals.
To maximize the R-squared value.
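The unit uses R, but the least squares objective can be sketched in a few lines of Python (the data here is made up for illustration): the fitted coefficients minimize the residual sum of squares, so any other coefficients give a larger RSS.

```python
import numpy as np

# Hypothetical data: y roughly linear in x, plus noise
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least squares fit: intercept and slope minimizing the sum of squared residuals
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = np.sum((y - X @ beta) ** 2)

# Perturbing the coefficients can only increase the RSS
beta_alt = beta + np.array([0.1, -0.05])
rss_alt = np.sum((y - X @ beta_alt) ** 2)
print(rss <= rss_alt)  # True
```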
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following scenarios is indicative of an overfitting model?
The model performs well on the training data but poorly on new, unseen data
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What does a lower AIC value indicate when comparing multiple models?
The model is simpler and underfits the data.
The model has a better fit to the data with a trade-off for complexity.
The model is less likely to include irrelevant predictors.
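For a Gaussian linear model, AIC (up to an additive constant) is n·log(RSS/n) + 2k, where k is the number of estimated coefficients. A minimal Python sketch, using simulated data, shows how the +2 penalty per parameter discourages irrelevant predictors (the unit itself uses R's built-in `AIC()`):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)  # irrelevant predictor: unrelated to y
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)

def aic_linear(X, y):
    # Gaussian linear model AIC, up to an additive constant:
    # n * log(RSS / n) + 2 * k, k = number of coefficients
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n_obs, k = X.shape
    return n_obs * np.log(rss / n_obs) + 2 * k

ones = np.ones(n)
aic_small = aic_linear(np.column_stack([ones, x1]), y)
aic_big = aic_linear(np.column_stack([ones, x1, x2]), y)
# Adding x2 barely reduces the RSS but pays a +2 complexity penalty,
# so the smaller model typically has the lower (better) AIC.
```

Note that a predictor adding no fit at all (e.g. an exact duplicate column) raises AIC by exactly 2, since RSS is unchanged while k grows by one.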
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You perform backward selection on a dataset using the following R command:
final_model <- step(lm(y ~ x1 + x2 + x3 + x4 + x5), direction = "backward")
Afterward, you observe that the final model includes only x1 and x4. Which of the following statements is most accurate about this final model?
The predictors x2, x3, and x5 were removed because their inclusion did not significantly reduce the AIC.
The predictors x2, x3, and x5 were removed because they were likely correlated with y.
The model's performance on the training data will improve, but it may not generalize well to new data.
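The question above uses R's `step()`. The same backward-elimination-by-AIC loop can be sketched in Python on simulated data (predictor indices, coefficients, and noise level here are all illustrative): repeatedly drop the predictor whose removal lowers AIC the most, and stop when no removal helps.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X_all = rng.normal(size=(n, 5))  # candidate predictors x1..x5
# Only columns 0 and 3 actually drive y (analogous to x1 and x4)
y = 1.0 + 2.0 * X_all[:, 0] + 1.5 * X_all[:, 3] + rng.normal(scale=0.5, size=n)

def aic(cols):
    # AIC (up to a constant) of the linear model using the given columns
    X = np.column_stack([np.ones(n)] + [X_all[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * X.shape[1]

# Backward elimination: drop the predictor whose removal most
# reduces AIC; stop when every removal would increase AIC.
current = list(range(5))
while current:
    best_aic, best_j = min((aic([c for c in current if c != j]), j)
                           for j in current)
    if best_aic < aic(current):
        current.remove(best_j)
    else:
        break
```

With strong true coefficients, the genuinely relevant predictors survive elimination, mirroring how `step()` retained x1 and x4.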
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
After performing stepwise regression, you notice the final model has an AIC close to the original model, but the number of predictors is reduced by half. Which of the following is a valid interpretation?
The reduced model likely has higher predictive accuracy due to fewer predictors.
The original model was overfitting the data, and the reduced model corrects this by removing noise.
The final model is likely over-simplified, as it removed predictors that might still be relevant.
The reduced model has a better trade-off between complexity and fit, which could improve its generalizability.
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
During model selection, you perform a t-test for the hypothesis H0: βj = 0 for each predictor. Which of the following scenarios would suggest that xj should be removed from the model?
The p-value for βj is greater than the significance level.
The p-value for βj is less than the significance level.
The predictor xj has a low correlation with the response variable y.
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
In a binary classification problem, if the ROC curve for a logistic regression model is a diagonal line from (0,0) to (1,1), what does this imply about the model's performance?
The model is no better than random guessing.
The model has perfect predictive ability.
The model has a high true positive rate.
The model's predictions are perfectly calibrated.
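A diagonal ROC curve corresponds to an AUC of 0.5: the classifier's scores carry no information about the labels. A small Python sketch (simulated scores, not course code) computes AUC via the Mann-Whitney formulation, i.e. the probability that a random positive outscores a random negative:

```python
import numpy as np

def auc(scores, labels):
    # Mann-Whitney formulation of AUC with tie handling:
    # P(random positive's score > random negative's score)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = 0.0
    for p in pos:
        wins += np.sum(p > neg) + 0.5 * np.sum(p == neg)
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(0)
labels = np.array([0, 1] * 50)
random_scores = rng.uniform(size=100)  # independent of labels: diagonal ROC
good_scores = labels + rng.normal(scale=0.1, size=100)  # highly informative

print(auc(random_scores, labels))  # close to 0.5
print(auc(good_scores, labels))    # close to 1.0
```

A constant score (every observation ranked equally) gives an AUC of exactly 0.5, the pure random-guessing diagonal.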