Is it correct to join training and validation set before inferring on test-set?
I would like to know whether it is correct procedure to join the training set and the validation set together, in order to train the model on this combined dataset before making predictions on the test set.

machine-learning dataset training
asked 9 hours ago by Simone
2 Answers
Yes. Once you have optimized your model and its hyperparameters with the validation set, it is advisable to train your final model on the combination of the training and validation sets before applying it to the test set.
Remember that once you have evaluated your model on the test set, you should not tweak it any more.
answered 9 hours ago by Alexis Pister
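The workflow this answer describes can be sketched in code. This is a minimal illustration, not a prescription: it assumes scikit-learn, synthetic data, and an arbitrary hyperparameter grid, all chosen purely for the example.

```python
# Sketch: tune on a validation set, refit the chosen model on
# train + validation, then touch the test set exactly once.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)

# 60/20/20 split into train / validation / test.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.5, random_state=0)

# Choose the regularization strength C on the validation set.
best_C, best_acc = None, -1.0
for C in [0.01, 0.1, 1.0, 10.0]:
    acc = LogisticRegression(C=C, max_iter=1000).fit(
        X_train, y_train).score(X_val, y_val)
    if acc > best_acc:
        best_C, best_acc = C, acc

# Refit the winning configuration on train + validation combined...
final_model = LogisticRegression(C=best_C, max_iter=1000).fit(
    np.vstack([X_train, X_val]), np.concatenate([y_train, y_val]))

# ...and evaluate once on the held-out test set. No tweaking after this.
test_acc = final_model.score(X_test, y_test)
print(round(test_acc, 3))
```

Note that the model is refit from scratch on the combined data with the selected hyperparameters; the validation score itself is discarded once it has done its job of picking `C`.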
In theory, you use the training set to learn the weights, the validation set to adjust the network architecture and other hyperparameters, and the test set to verify the generalisation of your network. You can find further details in this Towards Data Science article. Only the accuracy on the test set, which is unseen during both training and architecture tuning, gives you an unbiased idea of your performance.
If you now want to use your model on new data, in production for instance, you could re-train it on the training + validation sets.
answered 9 hours ago by Samuel Tap (new contributor)
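The three-way split this answer assumes can be written in a few lines of plain NumPy. This is a sketch under arbitrary choices (60/20/20 ratios, a fixed seed, and a hypothetical helper name `train_val_test_split`), not a standard library function.

```python
# Sketch: shuffle once, then carve the data into the three roles
# described above: fit weights / tune choices / final unbiased check.
import numpy as np

def train_val_test_split(X, y, val_frac=0.2, test_frac=0.2, seed=0):
    """Return disjoint train, validation, and test portions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    test_idx = idx[:n_test]
    val_idx = idx[n_test:n_test + n_val]
    train_idx = idx[n_test + n_val:]
    return (X[train_idx], y[train_idx],
            X[val_idx], y[val_idx],
            X[test_idx], y[test_idx])

X = np.arange(100).reshape(100, 1)
y = np.arange(100)
X_tr, y_tr, X_va, y_va, X_te, y_te = train_val_test_split(X, y)
print(len(X_tr), len(X_va), len(X_te))  # 60 20 20
```

Because the split is made from one shuffled permutation, the three portions are guaranteed disjoint, which is the property that keeps the test-set estimate unbiased.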