
What is the log of the PDF for a Normal Distribution?



I am learning Maximum Likelihood Estimation.



Per this post, the log of the PDF for a Normal Distribution looks like this:

[image from the linked post: the log of the PDF]

Let's call this equation 1.



According to any probability theory textbook, the formula of the PDF for a Normal Distribution is:

$$
\frac{1}{\sigma \sqrt{2\pi}}
\, e^{-\frac{(x - \mu)^2}{2\sigma^2}},
\quad -\infty < x < \infty.
$$

Taking the log produces:

\begin{align}
\ln\left(\frac{1}{\sigma \sqrt{2\pi}}
\, e^{-\frac{(x - \mu)^2}{2\sigma^2}}\right) &=
\ln\left(\frac{1}{\sigma \sqrt{2\pi}}\right) + \ln\left(e^{-\frac{(x - \mu)^2}{2\sigma^2}}\right)\\
&= -\ln(\sigma) - \frac{1}{2}\ln(2\pi) - \frac{(x - \mu)^2}{2\sigma^2},
\end{align}

which is very different from equation 1.

Is equation 1 right? What am I missing?
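(For reference, the single-observation expression derived above is easy to check numerically. A minimal sketch, assuming NumPy and SciPy are available; the values of $x$, $\mu$ and $\sigma$ are arbitrary illustrative numbers, not from the post:)

```python
import numpy as np
from scipy import stats

# illustrative values only
x, mu, sigma = 1.3, 0.5, 2.0

# log of the normal PDF, using the expression derived above
manual = -np.log(sigma) - 0.5 * np.log(2 * np.pi) - (x - mu) ** 2 / (2 * sigma ** 2)

# SciPy's built-in log-density for comparison
reference = stats.norm.logpdf(x, loc=mu, scale=sigma)

print(manual, reference)  # the two numbers agree up to floating-point error
```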










Tags: probability, log






asked 3 hours ago by shi95







  • Your first equation is the joint log-pdf of a sample of n iid normal random variables (AKA the log-likelihood of that sample). The second equation is the log-pdf of a single normal random variable. – Artem Mavrin, 3 hours ago

  • @ArtemMavrin, I think your comment would be a perfectly good answer if you expanded on it just a bit to make it slightly more clear. – StatsStudent, 2 hours ago












1 Answer
For a single observed value $x$ you have log-likelihood:

$$\ell_x(\mu,\sigma^2) = - \ln \sigma - \frac{1}{2} \ln (2 \pi) - \frac{1}{2} \Big( \frac{x-\mu}{\sigma} \Big)^2.$$

For a sample of observed values $\mathbf{x} = (x_1, \ldots, x_n)$ you then have:

$$\ell_\mathbf{x}(\mu,\sigma^2) = \sum_{i=1}^n \ell_{x_i}(\mu,\sigma^2) = - n \ln \sigma - \frac{n}{2} \ln (2 \pi) - \frac{1}{2 \sigma^2} \sum_{i=1}^n (x_i-\mu)^2.$$
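A quick numerical illustration of how the two expressions relate (a minimal sketch, assuming NumPy and SciPy are available; the sample and the values of $\mu$ and $\sigma$ are made up for the example):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=100)  # made-up sample of n = 100 observations

# per-observation log-density, i.e. the single-value expression above
single = -np.log(sigma) - 0.5 * np.log(2 * np.pi) - 0.5 * ((x - mu) / sigma) ** 2

# summing the per-observation terms gives the sample log-likelihood
manual_loglik = single.sum()

# the closed-form expression for the sample log-likelihood
closed_form = (-x.size * np.log(sigma)
               - x.size / 2 * np.log(2 * np.pi)
               - np.sum((x - mu) ** 2) / (2 * sigma ** 2))

# SciPy cross-check: sum of logpdf over the sample
scipy_loglik = stats.norm.logpdf(x, loc=mu, scale=sigma).sum()

print(manual_loglik, closed_form, scipy_loglik)  # all three agree up to rounding
```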






answered 2 hours ago by Ben


























