Expectation of inverse of sum of positive iid variables
Let $(X_i)_{i}$ be a sequence of iid positive random variables with mean $1$ and variance $\sigma^2$. Let $\bar{X}_n = \frac{\sum_{i=1}^n X_i}{n}$.
My question is: can we bound $\mathbb{E}(1/\bar{X}_n)$ as a function of $\sigma$ and $n$?
There seems to be a strategy that may work based on a Taylor expansion, but
- I'm not sure about the hypotheses that need to be met;
- whether it works in this case; and
- whether we can say something definite about $\bar{X}_n$, or whether we need to use the central limit theorem and can only say this for the normal approximation.
More details about the Taylor expansion. According to this Wikipedia article,
$$\mathbb{E}(f(X)) \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2.$$
So in my case, with $f(x) = 1/x$, $f''(1) = 2$ and $\sigma_{\bar{X}_n}^2 = \sigma^2/n$, it would give something like:
$$\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{n}.$$
I'm trying to find a formal proof of a similar result, or hypotheses under which it works. Maybe references?
Thanks
EDIT: if needed, we can assume the $(X_i)_i$ are discrete: there exist $v_1 < \cdots < v_K$ (with $v_1 > 0$, by positivity) such that $\mathbb{P}(X = v_k) = p_k$ and $\sum_k p_k = 1$. In this case we know that $\bar{X}_n \geq v_1$. Although I believe something can be said in the general case.
PS: this is almost a cross-post of this on Math.SE.
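As a quick sanity check of the approximation above, here is a minimal Monte Carlo sketch, assuming a hypothetical two-point distribution with mean $1$ (an arbitrary choice; any positive discrete law with mean $1$ would do):

```python
import numpy as np

rng = np.random.default_rng(0)

# A (hypothetical) two-point distribution satisfying the assumptions:
# P(X = 0.5) = 2/3, P(X = 2.0) = 1/3, so E(X) = 1 and sigma^2 = 0.5.
v = np.array([0.5, 2.0])
p = np.array([2.0 / 3.0, 1.0 / 3.0])
sigma2 = float((v - 1.0) ** 2 @ p)

for n in (10, 100, 1000):
    # 20,000 replications of the sample mean of n iid draws
    xbar = rng.choice(v, size=(20_000, n), p=p).mean(axis=1)
    est = (1.0 / xbar).mean()
    print(f"n={n:5d}  E(1/Xbar) ~ {est:.5f}   1 + sigma^2/n = {1.0 + sigma2 / n:.5f}")
```

The two columns agree up to Monte Carlo noise, which is consistent with the $1 + \sigma^2/n$ approximation.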
variance expected-value iid
$begingroup$
Let $(X_i)_{i}$ be a sequence of iid positive variables of mean 1 and variance $sigma^2$. Let $bar{X}_n = frac{sum_{i=1}^n X_i}{n}$.
My question is: Can we can bound $mathbb{E}(1/bar{X}_n)$ as a function of $sigma$ and $n$?
There seems to be some strategy that may work based on the taylor extension, but
- I'm not sure about the hypothesis that need to be met;
- if it works in this case; and
- if we can say something definite on $bar{X}_n$ or if we need to use the central limit theorem and can only say this for the normal approximation?
More details about the Taylor expansion. According to this wikipedia article,
$$mathbb{E}(f(X)) approx f(mu_X) +frac{f''(mu_X)}{2}sigma_X^2$$
So in my case it would give something like:
$$mathbb{E}(1/bar{X}_n) approx 1 +frac{sigma^2}{4 n}$$
I'm trying to find maybe a formal proof of a similar result, or hypothesis so that it works. Maybe references?
Thanks
EDIT: if needed, we can consider that the $(X_i)_i$ are discrete, there exists $v_1<cdots<v_K$ such that $mathbb{P}(X=v_k)=p_k$ and $sum p_k = 1$. In this case we know that $bar{X}_n geq v_1$. Although I believe something can be said in the general case.
PS: this is almost a cross-post of this on Math.SE.
variance expected-value iid
New contributor
$endgroup$
Hello Gopi, thank you for your question. If this is ALMOST a cross-post your question is welcome here, but if it is a cross-post you should rather put a bounty on the post on Math.SE.
– Ferdi, yesterday

It's almost: the question is different (bounds on the expectation vs. speed of convergence). If I get the answer on Math.SE it probably won't help me with this question, but I've referenced it because it's likely that someone who has the answer to one has the answer to both.
– Gopi, yesterday

Perhaps Markov's inequality or Chebyshev's inequality would be useful.
– Ertxiem, yesterday
2 Answers
You cannot bound that expectation in terms of $\sigma$ and $n$ alone. That's because there is the distinct possibility that the expectation does not exist at all (or is $\infty$). See "I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?". If the conditions given there are fulfilled for the density of $X_1$, they will also be fulfilled for the density of $\bar{X}_n$. If densities do not exist but probability mass functions do, the situation is simpler, since your assumptions rule out a probability atom at zero; a probability density, by contrast, can still be positive at zero even if $P(X > 0) = 1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, \dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1 > 0$, the expectation of $1/\bar{X}_n$ certainly exists (irrespective of whether $K$ is finite or not). And, since the function $x \mapsto 1/x$ is convex for $x > 0$, we can use Jensen's inequality to conclude that $\mathbb{E}(1/\bar{X}_n) \ge 1/\mathbb{E}(\bar{X}_n) = 1$.
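A minimal sketch of the nonexistence point for the $n = 1$ case: $X \sim \mathrm{Exp}(1)$ is positive with mean $1$ and variance $1$, yet its density $e^{-x}$ equals $1$ at $x = 0$, so $\mathbb{E}(1/X) = \int_0^\infty x^{-1} e^{-x}\,dx = \infty$, and running averages of $1/X$ never stabilize. (For this particular density the problem disappears for $n \geq 2$, since $\bar{X}_n$ is then Gamma distributed with $\mathbb{E}(1/\bar{X}_n) = n/(n-1)$.)

```python
import numpy as np

rng = np.random.default_rng(1)

# X ~ Exp(1): positive, mean 1, variance 1, but its density e^{-x} equals 1
# at x = 0, and E[1/X] = int_0^inf exp(-x)/x dx diverges.  The running
# average of 1/X therefore never stabilizes: it jumps whenever a draw
# lands very close to 0.
x = rng.exponential(scale=1.0, size=10**7)
running = np.cumsum(1.0 / x) / np.arange(1, x.size + 1)
for k in (10**3, 10**4, 10**5, 10**6, 10**7):
    print(f"after {k:>8} draws: average of 1/X = {running[k - 1]:8.3f}")
```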
But the thing is that this is really not a general case but a very specific one: it's very unlikely that there is probability mass near 0 (we can even evaluate it with Markov's inequality): $\bar{X}_n$ is centered around $1$ and has variance $\sigma^2 / n$.
– Gopi, yesterday

But even a tiny (but positive) probability close to zero can lead to the expectation of the inverse being $\infty$. Can you rule out that possibility?
– kjetil b halvorsen, yesterday

In my case I can indeed (the distribution of the $X_i$ is discrete); I'll add this hypothesis to the question. I'm still interested, from a theoretical perspective, in the general case: I believe something can still be said there (or I'd be interested in a counterexample to your statement).
– Gopi, yesterday

The information about discreteness is really important! And on what support? If positive integers, $P(X=0)=0$ and $\mu=1$ is really restrictive ...
– kjetil b halvorsen, yesterday

Obviously the support is not the positive integers ;). The support is a set of rational numbers.
– Gopi, yesterday
I think I have the gist of it.
$f(x) = 1/x$ is infinitely differentiable on $(0, +\infty)$, with $f'(1) = -1$, $f''(1) = 2$ and $f'''(x) = -6/x^4$. Taylor's theorem (Lagrange form of the remainder) tells us:
For every $x > 0$ there exists $\varepsilon$ between $1$ and $x$ such that $f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)(x-1)^2}{2} + \frac{f'''(\varepsilon)(x-1)^3}{6}$.
In our case, if the $X_i$ take values in $[v_1, +\infty)$, then so does $\bar{X}_n$, and therefore $\varepsilon \geq v_1$ (note that $v_1 \leq 1$, since the mean is $1$).
Hence
$\mathbb{E}(1/\bar{X}_n) = \mathbb{E}\left(1 - (\bar{X}_n - 1) + (\bar{X}_n - 1)^2 - \frac{(\bar{X}_n - 1)^3}{\varepsilon^4}\right)$, and since $\mathbb{E}(\bar{X}_n - 1) = 0$ and $\mathbb{E}\left((\bar{X}_n - 1)^2\right) = V(\bar{X}_n) = \frac{\sigma^2}{n}$,
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= 1 + \frac{\sigma^2}{n} - \mathbb{E}\left(\frac{(\bar{X}_n - 1)^3}{\varepsilon^4}\right).
\end{align*}
Since $0 < 1/\varepsilon^4 \leq 1/v_1^4$, the remainder is bounded in absolute value by $\mathbb{E}\left(|\bar{X}_n - 1|^3\right)/v_1^4$, and hence
$$1 + \frac{\sigma^2}{n} - \frac{\mathbb{E}\left(|\bar{X}_n - 1|^3\right)}{v_1^4} \leq \mathbb{E}(1/\bar{X}_n) \leq 1 + \frac{\sigma^2}{n} + \frac{\mathbb{E}\left(|\bar{X}_n - 1|^3\right)}{v_1^4},$$
where $\mathbb{E}\left(|\bar{X}_n - 1|^3\right) = O(n^{-3/2})$ whenever $\mathbb{E}|X_1|^3 < \infty$ (see the reference in the comment below).
For the case where the $X_i$ do not admit a minimum but have moments of all orders, one can attempt a similar transformation using the full Taylor expansion (here $f^{(i)}(1) = (-1)^i\, i!$):
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\mathbb{E}\left((\bar{X}_n - 1)^i\right)\\
&= \sum_{i=0}^{+\infty} (-1)^i\, \mathbb{E}\left((\bar{X}_n - 1)^i\right),
\end{align*}
provided the exchange of sum and expectation can be justified. Now if we can say that the $k$th central moment of $\bar{X}_n$ is $O(n^{-\lceil k/2 \rceil})$, this validates that $\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{n}$.
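A quick numerical check of the two-sided bound, assuming the same hypothetical two-point law as in the question's sketch ($v_1 = 0.5$, mean $1$, $\sigma^2 = 0.5$). Because there are only two support points, the number of draws at the upper value is Binomial, so $\mathbb{E}(1/\bar{X}_n)$ and the bound can be computed exactly rather than by simulation:

```python
import numpy as np
from scipy.stats import binom

# Two-point distribution: X = 2.0 w.p. 1/3, X = 0.5 w.p. 2/3 (mean 1, v1 = 0.5).
# If K ~ Binomial(n, 1/3) counts the draws equal to 2.0, then
# Xbar_n = (2.0*K + 0.5*(n - K)) / n, so E(1/Xbar_n) is an exact finite sum.
p2, sigma2, v1 = 1.0 / 3.0, 0.5, 0.5

for n in (10, 100, 1000, 10000):
    k = np.arange(n + 1)
    xbar = (2.0 * k + 0.5 * (n - k)) / n
    pmf = binom.pmf(k, n, p2)
    e_inv = (pmf / xbar).sum()                          # exact E(1/Xbar_n)
    err = abs(e_inv - 1.0 - sigma2 / n)                 # |E(1/Xbar_n) - 1 - sigma^2/n|
    bound = (pmf * np.abs(xbar - 1.0) ** 3).sum() / v1**4  # E|Xbar_n - 1|^3 / v1^4
    print(f"n={n:6d}  E(1/Xbar)={e_inv:.6f}  |err|={err:.2e}  bound={bound:.2e}")
```

The error term shrinks like $n^{-3/2}$, one order of $\sqrt{n}$ faster than the $\sigma^2/n$ correction, and stays below the bound.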
Turns out that $\bar{X}_n$ does admit higher moments and that the $p$th central moment is of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf
– Gopi, 23 hours ago