{"id":410,"date":"2016-11-01T13:08:50","date_gmt":"2016-11-01T11:08:50","guid":{"rendered":"http:\/\/members.loria.fr\/EColin\/?p=410"},"modified":"2016-11-01T13:10:12","modified_gmt":"2016-11-01T11:10:12","slug":"why-you-should-use-cross-entropy-error-instead-of-classification-error-or-mean-squared-error-for-neural-network-classifier-training-james-d-mccaffrey","status":"publish","type":"post","link":"https:\/\/members.loria.fr\/EColin\/why-you-should-use-cross-entropy-error-instead-of-classification-error-or-mean-squared-error-for-neural-network-classifier-training-james-d-mccaffrey\/","title":{"rendered":"Why You Should Use Cross-Entropy Error Instead Of Classification Error Or Mean Squared Error For Neural Network Classifier Training | James D. McCaffrey"},"content":{"rendered":"<blockquote><p>When using a neural network to perform classification and prediction, it is usually better to use cross-entropy error than classification error, and somewhat better to use cross-entropy error than \u2026<\/p><\/blockquote>\n<p>Source\u00a0: <em><a href=\"https:\/\/jamesmccaffrey.wordpress.com\/2013\/11\/05\/why-you-should-use-cross-entropy-error-instead-of-classification-error-or-mean-squared-error-for-neural-network-classifier-training\/\">Why You Should Use Cross-Entropy Error Instead Of Classification Error Or Mean Squared Error For Neural Network Classifier Training | James D. McCaffrey<\/a><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>When using a neural network to perform classification and prediction, it is usually better to use cross-entropy error than classification error, and somewhat better to use cross-entropy error than \u2026<\/p>\n<p>Source\u00a0: <em><a href=\"https:\/\/jamesmccaffrey.wordpress.com\/2013\/11\/05\/why-you-should-use-cross-entropy-error-instead-of-classification-error-or-mean-squared-error-for-neural-network-classifier-training\/\">Why You Should Use Cross-Entropy Error Instead Of Classification Error Or Mean Squared Error For Neural Network Classifier Training | James D. McCaffrey<\/a><\/em><\/p>\n","protected":false},"author":115,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[19,18],"tags":[],"class_list":["post-410","post","type-post","status-publish","format-standard","hentry","category-deeplearning","category-statistiques"],"_links":{"self":[{"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/posts\/410","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/users\/115"}],"replies":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/comments?post=410"}],"version-history":[{"count":1,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/posts\/410\/revisions"}],"predecessor-version":[{"id":411,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/posts\/410\/revisions\/411"}],"wp:attachment":[{"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/media?parent=410"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/categories?post=410"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/members.loria.fr\/EColin\/wp-json\/wp\/v2\/tags?post=410"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}