How is it that an ML estimator might not be unique or consistent?

Christian H Weiss says that:




In general, it is not clear if the ML estimators (uniquely) exist and if they are consistent.




Can someone explain what he means? Do we not generally know the shape of a log-likelihood function once we specify the probability distribution?










Tags: estimation, maximum-likelihood, consistency






asked by Vykta Wakandigara
2 Answers

















A multimodal likelihood function can have two modes of exactly the same value. In this case, the MLE may not be unique, as there may be two possible estimators obtained by solving the score equation $\partial \ell(\theta; x) / \partial \theta = 0$.

Example of such a likelihood, from Wikipedia:

[Figure: a multimodal likelihood function]

Here, you can see that there is no unique value of $\theta$ that maximises the likelihood. The Wikipedia link also gives some conditions for the existence of unique and consistent MLEs, although I believe there are more (a more comprehensive literature search would guide you well).
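For concreteness, here is a small numerical sketch of my own (not from the answer or the linked notes): a Cauchy location model with two symmetrically placed observations. Its log-likelihood has two modes of exactly the same height, so the score equation has two solutions and a numerical optimiser lands on either one depending on the starting point.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical data: two observations from a Cauchy(theta, scale=1) model,
# placed symmetrically around zero.
x = np.array([-3.0, 3.0])

def neg_log_lik(theta):
    # Negative log-likelihood of the Cauchy location model
    return -stats.cauchy.logpdf(x, loc=theta).sum()

# Local optimisation from two different regions finds two different maximisers,
# theta = +/- sqrt(3**2 - 1) ~= +/- 2.828, with the same likelihood value.
left = optimize.minimize_scalar(neg_log_lik, bounds=(-5, 0), method="bounded")
right = optimize.minimize_scalar(neg_log_lik, bounds=(0, 5), method="bounded")

print(left.x, right.x)                             # ~ -2.828 and +2.828
print(neg_log_lik(left.x), neg_log_lik(right.x))   # same value (up to numerical
                                                   # tolerance): two equal modes
```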



Edit: This link about MLEs, which I believe points to lecture notes from Cambridge, lists a few more regularity conditions for the MLE to exist.

You can find examples of inconsistent ML estimators in this CV question.






answered by InfProbSciX













One example arises from rank deficiency. Suppose that you're fitting an OLS regression but your design matrix is not of full rank. In this case, there are infinitely many coefficient vectors that attain the maximum likelihood value.
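As a rough illustration (my own sketch, with made-up data): below, the third column of the design matrix is the sum of the first two, so the matrix has rank 2 rather than 3, and adding any multiple of a null-space vector to a least-squares solution leaves the fitted values, and hence the Gaussian likelihood, unchanged.

```python
import numpy as np

# Hypothetical rank-deficient design: the third column is the sum of the first two.
rng = np.random.default_rng(0)
n = 50
x1, x2 = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([x1, x2, x1 + x2])          # rank 2, not 3
y = 1.0 * x1 + 2.0 * x2 + rng.normal(scale=0.1, size=n)

# One least-squares (= Gaussian ML) solution, returned by lstsq
beta_min, *_ = np.linalg.lstsq(X, y, rcond=None)

# Any multiple of a null-space vector can be added without changing X @ beta
null_vec = np.array([1.0, 1.0, -1.0])           # X @ null_vec is (numerically) zero
beta_alt = beta_min + 5.0 * null_vec

def rss(b):
    return np.sum((y - X @ b) ** 2)

print(beta_min, beta_alt)                       # very different coefficients...
print(rss(beta_min), rss(beta_alt))             # ...the same residual sum of squares
                                                # (up to floating-point noise), hence
                                                # the same likelihood: no unique MLE
```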



Another case arises in the MLE for binary logistic regression. Suppose that the data exhibit separation; in this case, the likelihood does not have a well-defined maximum, in the sense that arbitrarily large coefficients monotonically increase the likelihood.
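A rough sketch of the separation case (again my own, with hypothetical data): here x perfectly separates the two classes, so scaling the slope upward keeps pushing the Bernoulli log-likelihood towards 0 and no finite maximiser exists.

```python
import numpy as np

# Hypothetical perfectly separated data: y = 1 exactly when x > 0.
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

def log_lik(beta):
    """Bernoulli log-likelihood of a logistic model p = sigmoid(beta * x),
    written in the numerically stable form sum(y * z - log(1 + exp(z)))."""
    z = beta * x
    return np.sum(y * z - np.logaddexp(0.0, z))

for beta in [1.0, 10.0, 100.0]:
    print(beta, log_lik(beta))   # increases towards 0: a larger beta is always
                                 # better, so there is no finite maximiser
```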






answered by Sycorax



























                               
