Is Normal(mean, variance) mod x still a normal distribution?







Is the following distribution still a normal one?



[histogram of the generated values: range 0–255, peak near 128]



The range of values is constrained to the hard limits $\{0, 255\}$, and it's generated by $\mathcal{N}(\mu, \sigma^2) \operatorname{mod} 256 + 128^{\dagger}$. So it's not that I'm simply truncating a normal, but folding it back into itself. Consequently the probability density is never asymptotic to 0.



Normal, or is there a more appropriate term for it?





$\dagger$ The +128 is just to centre the peak; otherwise it looks silly since $\mu = 0$. $\sigma \sim 12$, $< 20$. I expect that $\mu$ can be ignored for the purposes of classification.



In detail, it's a plot of `128 + (sample1 - sample2) & 0xff`. I wouldn't fixate on the centring too much, as there's some weird +/- stuff to do with computer networking and transmission of unsigned bytes.
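For concreteness, here is a minimal R sketch of that generation process. The distributions of `sample1` and `sample2`, and whether the +128 happens before or after the wrap, are assumptions for illustration only; the post doesn't specify them.

    # Illustrative only: difference two sample streams, recentre at 128,
    # and wrap the result into a single byte (0..255).
    n <- 1e6
    set.seed(1)
    sample1 <- rnorm(n, mean = 50, sd = 8)   # assumed, not from the post
    sample2 <- rnorm(n, mean = 50, sd = 8)   # assumed, not from the post

    x <- (128 + round(sample1 - sample2)) %% 256
    hist(x, breaks = 0:256 - 0.5, freq = FALSE, col = "grey",
         main = "128 + (sample1 - sample2), wrapped mod 256")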










Tag: distributions

asked Apr 21 at 13:34 by Paul Uszak, edited Apr 21 at 14:21








  • Can you give the values of $\mu$ and $\sigma^2$? Also, there seems to be something off: the result of a $\text{mod } 256$ operation will be between 0 and 256, so the result after adding 128 should be between 128 and 384, unlike your picture. Did you simply not add 128 before plotting? – Stephan Kolassa, Apr 21 at 13:54

  • I don't think it would hold for any $\mu$ and $\sigma$, and for some parameters it would just be like subtracting a value. – Lerner Zhang, Apr 21 at 14:00

  • Seems related to the wrapped normal distribution, also discussed here. – jnez71, Apr 21 at 18:54














1 Answer


  1. No, by definition your distribution is not normal. The normal distribution has unbounded support, yours doesn't. QED.


  2. Often enough, we don't need a normal distribution, only one that is "normal enough". Your distribution may well be normal enough for whatever you want to use it for.



  3. However, I don't think a normal distribution is really the best way to parameterize your data. Specifically, even playing around with the standard deviation, there doesn't seem to be a way to get the kurtosis your data exhibits:



    [2×2 grid of histograms of the wrapped normal samples, SD = 5, 10, 20, 40]



    Note how we either get all the mass over a much smaller part of the horizontal axis for small SDs, or get much more mass in the center of the distribution than in your picture for large SDs.



    # Simulate N(mu, SD) samples, wrap them with mod 256, and histogram
    # the result for a range of standard deviations.
    mu <- 128
    ss <- c(5, 10, 20, 40)   # standard deviations to compare
    nn <- 1e7                # number of draws per panel

    par(mfrow = c(2, 2), mai = c(.5, .5, .5, .1))
    for (ii in 1:4) {
      set.seed(1)
      hist(rnorm(nn, mu, ss[ii]) %% 256, col = "grey",
           freq = FALSE, xlim = c(0, 256), xlab = "", ylab = "",
           main = paste("SD =", ss[ii]))
    }


    I also tried a $t$ distribution by varying the degrees of freedom, but that didn't look very good, either.




So, if your distribution is not normal enough for your purposes, you may want to look at something like a Pearson type VII distribution. If you wrap this with the modulo operator, it will again not be a Pearson VII strictly speaking, but it may again be "sufficiently Pearson VII".
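As a rough sketch of that suggestion: a Pearson type VII can be written as a location-scale $t$ distribution, so wrapping one looks roughly like the following in R. The degrees of freedom and scale below are illustrative guesses, not values fitted to the data.

    # Wrap a heavier-tailed location-scale t (a member of the Pearson VII
    # family) the same way the normal was wrapped above.
    nn <- 1e6
    df <- 3     # degrees of freedom -- illustrative, not fitted
    sc <- 10    # scale              -- illustrative, not fitted

    set.seed(1)
    x <- (128 + sc * rt(nn, df)) %% 256
    hist(x, col = "grey", freq = FALSE, xlim = c(0, 256),
         main = paste("wrapped location-scale t, df =", df))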






answered Apr 21 at 14:19 by Stephan Kolassa
  • I called it Normal as I assumed that '(sample1 - sample2)' would be Normal, even if 'sample' itself is not Normal. 'sample' actually looks more like log-Normal. Would that explain the kurtosis problem? – Paul Uszak, Apr 21 at 17:04

  • Perhaps it could be normal in the wrapped distribution sense of directional statistics? – jnez71, Apr 21 at 19:11

  • That is possible. I'm afraid you have lost me with the sample, and its log-normality. Could you give a few more details? – Stephan Kolassa, Apr 21 at 20:05

  • My understanding of it isn't fantastic (otherwise I'd post an answer), but I think the idea is this: when you have a probability space whose sample set is not $\mathbb{R}$ but some cyclic group like $SO(2)$, or perhaps $\mathbb{R} \bmod 256$, the mathematically accepted way to generalize a distribution on $\mathbb{R}$ to such a space is to do what the OP did and "identify" (via modulo) samples with each other, summing the probabilities ad infinitum, as the wiki article describes: $p_{\text{wrapped}}(x) = \sum_{k=-\infty}^{\infty} p(x + k r)$, where $r$ is the mod value (so 256 for the OP). – jnez71, Apr 21 at 21:26

  • For the normal distribution you get this, which I suspect retains most of the properties we'd hope for (though I'm not 100% sure). Like perhaps a central limit theorem for this mod space? – jnez71, Apr 21 at 21:29
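For reference, a minimal R sketch of that wrapped-normal density, with the infinite sum truncated to a handful of terms (enough when $\sigma \ll 256$); the $\mu$ and $\sigma$ values are illustrative, not taken from the thread:

    # p_wrapped(x) = sum over k of dnorm(x + k*r), truncated to k = -K..K
    dwrapnorm <- function(x, mu = 128, sigma = 12, r = 256, K = 5) {
      rowSums(sapply(-K:K, function(k) dnorm(x + k * r, mean = mu, sd = sigma)))
    }

    xs <- seq(0, 256, length.out = 513)
    set.seed(1)
    samples <- rnorm(1e6, mean = 128, sd = 12) %% 256

    hist(samples, breaks = 64, freq = FALSE, col = "grey",
         main = "wrapped normal: samples vs. truncated-sum density")
    lines(xs, dwrapnorm(xs), lwd = 2)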











