Distribution and density functions  (Page 2/3)


A distribution function determines the probability mass in each semi-infinite interval $\left(-\infty ,t\right]$ . According to the discussion referred to above, this uniquely determines the induced distribution.

The distribution function F X for a simple random variable is easily visualized. The distribution consists of point mass p i at each point t i in the range. To the left of the smallest value in the range, ${F}_{X}\left(t\right)=0$ ; as t increases to the smallest value t 1 , ${F}_{X}\left(t\right)$ remains constant at zero until it jumps by the amount p 1 . ${F}_{X}\left(t\right)$ remains constant at p 1 until t increases to t 2 , where it jumps by the amount p 2 to the value ${p}_{1}+{p}_{2}$ . This continues until the value of ${F}_{X}\left(t\right)$ reaches 1 at the largest value t n . The graph of F X is thus a step function, continuous from the right, with a jump in the amount p i at the corresponding point t i in the range. A similar situation exists for a discrete-valued random variable which may take on an infinity of values (e.g., the geometric distribution or the Poisson distribution considered below). In this case, there is always some probability at points to the right of any t i , but this must become vanishingly small as t increases, since the total probability mass is one.
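The step-function behavior described above can be sketched in a few lines. This is a Python illustration (not one of the text's MATLAB m-functions), and the values and probabilities used are made up:

```python
# Sketch: the distribution function F(t) = P(X <= t) of a simple random
# variable is a right-continuous step function with jump p_i at t_i.

def simple_dbn(X, PX):
    """Return F(t) = P(X <= t) for a simple random variable.

    X  : values in the range (here assumed sorted)
    PX : matching point probabilities, summing to one
    """
    def F(t):
        # Accumulate the point masses at or to the left of t.
        return sum(p for x, p in zip(X, PX) if x <= t)
    return F

F = simple_dbn([1, 2, 4], [0.2, 0.3, 0.5])
print(F(0.5))   # 0   -- to the left of the smallest value
print(F(1))     # 0.2 -- jump of p_1 at t_1 (continuity from the right)
print(F(3))     # 0.5 -- constant between jumps
print(F(4))     # 1.0 -- reaches one at the largest value
```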

The procedure ddbn may be used to plot the distribution function for a simple random variable from a matrix X of values and a corresponding matrix $PX$ of probabilities.

Graph of F X for a simple random variable

>> c = [10 18 10 3];          % Distribution for X in Example 6.5.1
>> pm = minprob(0.1*[6 3 5]);
>> canonic
 Enter row vector of coefficients  c
 Enter row vector of minterm probabilities  pm
Use row matrices X and PX for calculations
Call for XDBN to view the distribution
>> ddbn                       % Circles show values at jumps
 Enter row matrix of VALUES  X
 Enter row matrix of PROBABILITIES  PX
% Printing details   See [link]

Description of some common discrete distributions

We make repeated use of a number of common distributions which are used in many practical situations. This collection includes several distributions which are studied in the chapter "Random Variables and Probabilities" .

1. Indicator function . $X={I}_{E}$ $P\left(X=1\right)=P\left(E\right)=p$ $P\left(X=0\right)=q=1-p$ . The distribution function has a jump in the amount q at $t=0$ and an additional jump of p to the value 1 at $t=1$ .
2. Simple random variable $X=\sum _{i=1}^{n}{t}_{i}{I}_{{A}_{i}}$ (canonical form)
$P\left(X={t}_{i}\right)=P\left({A}_{i}\right)={p}_{i}$
The distribution function is a step function, continuous from the right, with a jump of p i at $t={t}_{i}$ (See [link] for [link] )
3. Binomial $\left(n,p\right)$ . This random variable appears as the number of successes in a sequence of n Bernoulli trials with probability p of success. In its simplest form
$X=\sum _{i=1}^{n}{I}_{{E}_{i}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\text{with}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\left\{{E}_{i}:1\le i\le n\right\}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\text{independent}$
$P\left({E}_{i}\right)=p\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}P\left(X=k\right)=C\left(n,\phantom{\rule{0.166667em}{0ex}}k\right){p}^{k}{q}^{n-k}$
As pointed out in the study of Bernoulli sequences in the unit on Composite Trials, two m-functions ibinom and cbinom are available for computing the individual and cumulative binomial probabilities.
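As a sketch of what these two m-functions compute, here is a Python analogue (an assumed illustration, not the m-functions themselves), with cbinom returning the upper tail $P\left(X\ge k\right)$ as in the examples later in this section:

```python
# Python sketches of the individual and cumulative binomial probabilities.
from math import comb

def ibinom(n, p, k):
    """P(X = k) for X ~ binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def cbinom(n, p, k):
    """P(X >= k) for X ~ binomial(n, p)."""
    return sum(ibinom(n, p, j) for j in range(k, n + 1))

# P(X >= 5) for n = 10, p = 1/3 (the value used in the game example below).
print(round(cbinom(10, 1/3, 5), 4))   # 0.2131
```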
4. Geometric $\left(p\right)$ There are two related distributions, both arising in the study of continuing Bernoulli sequences. The first counts the number of failures before the first success. This is sometimes called the “waiting time.” The event $\left\{X=k\right\}$ consists of a sequence of k failures, then a success. Thus
$P\left(X=k\right)={q}^{k}p,\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}0\le k$
The second designates the component trial on which the first success occurs. The event $\left\{Y=k\right\}$ consists of $k-1$ failures, then a success on the k th component trial. We have
$P\left(Y=k\right)={q}^{k-1}p,\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}1\le k$
We say X has the geometric distribution with parameter $\left(p\right)$ , which we often designate by $X\sim$ geometric $\left(p\right)$ . Now $Y=X+1$ or $Y-1=X$ . For this reason, it is customary to refer to the distribution for the number of the trial for the first success by saying $Y-1\sim$ geometric $\left(p\right)$ . The probability of k or more failures before the first success is $P\left(X\ge k\right)={q}^{k}$ . Also
$P\left(X\ge n+k|X\ge n\right)=\frac{P\left(X\ge n+k\right)}{P\left(X\ge n\right)}={q}^{n+k}/{q}^{n}={q}^{k}=P\left(X\ge k\right)$
This suggests that a Bernoulli sequence essentially "starts over" on each trial. If it has failed n times, the probability of failing an additional k or more times before the next success is the same as the initial probability of failing k or more times before the first success.
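This memoryless property can be checked numerically. A small sketch (the particular values of p, n, and k below are arbitrary choices):

```python
# Check of the memoryless property P(X >= n+k | X >= n) = P(X >= k)
# for X ~ geometric(p), X counting failures before the first success,
# so that P(X >= k) = q**k.
p = 0.3
q = 1 - p
n, k = 4, 6

lhs = q**(n + k) / q**n   # P(X >= n+k) / P(X >= n)
rhs = q**k                # P(X >= k)
print(abs(lhs - rhs) < 1e-12)   # True: the sequence "starts over"
```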

The geometric distribution

A statistician is taking a random sample from a population in which two percent of the members own a BMW automobile. She takes a sample of size 100. What is the probability of finding no BMW owners in the sample?

Solution

The sampling process may be viewed as a sequence of Bernoulli trials with probability $p=0.02$ of success. The probability of 100 or more failures before the first success is ${0.98}^{100}=0.1326$ , or about 1/7.5.
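The computation is just the geometric tail probability ${q}^{k}$ ; a quick check:

```python
# P(X >= 100) = q**100 with q = 0.98, the geometric tail probability.
p = 0.02
q = 1 - p
print(round(q**100, 4))   # 0.1326 -- about 1/7.5
```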

5. Negative binomial $\left(m,p\right)$ . X is the number of failures before the m th success. It is generally more convenient to work with $Y=X+m$ , the number of the trial on which the m th success occurs. An examination of the possible patterns and elementary combinatorics show that
$P\left(Y=k\right)=C\left(k-1,m-1\right){p}^{m}{q}^{k-m},\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}m\le k$
There are $m-1$ successes in the first $k-1$ trials, then a success. Each combination has probability ${p}^{m}{q}^{k-m}$ . We have an m-function nbinom to calculate these probabilities.
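The formula can be sketched directly in Python (an assumed analogue of the nbinom m-function, not the m-function itself):

```python
# P(Y = k), where Y is the number of the trial on which the m-th success
# occurs: C(k-1, m-1) p^m q^(k-m), defined for k >= m.
from math import comb

def nbinom(m, p, k):
    """P(Y = k) for the negative binomial trial-number variable Y."""
    return comb(k - 1, m - 1) * p**m * (1 - p)**(k - m)

# P(Y <= 10) for m = 5, p = 1/3: the fifth success in ten or fewer trials.
print(round(sum(nbinom(5, 1/3, k) for k in range(5, 11)), 4))   # 0.2131
```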

A game of chance

A player throws a single six-sided die repeatedly. He scores if he throws a 1 or a 6. What is the probability he scores five times in ten or fewer throws?

>> p = sum(nbinom(5,1/3,5:10))
p  =  0.2131

An alternate solution is possible with the use of the binomial distribution . The m th success comes not later than the k th trial iff the number of successes in k trials is greater than or equal to m .

>> P = cbinom(10,1/3,5)
P  =  0.2131
6. Poisson $\left(\mu \right)$ . This distribution is assumed in a wide variety of applications. It appears as a counting variable for items arriving with exponential interarrival times (see the relationship to the gamma distribution below). For large n and small p (which may not be a value found in a table), the binomial distribution is approximately Poisson $\left(np\right)$ . Use of the generating function (see Transform Methods) shows the sum of independent Poisson random variables is Poisson. The Poisson distribution is integer valued, with
$P\left(X=k\right)={e}^{-\mu }\frac{{\mu }^{k}}{k!},\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}\phantom{\rule{0.277778em}{0ex}}0\le k$
Although Poisson probabilities are usually easier to calculate with scientific calculators than binomial probabilities, the use of tables is often quite helpful. As in the case of the binomial distribution, we have two m-functions for calculating Poisson probabilities. These have advantages of speed and parameter range similar to those for ibinom and cbinom.
• $P\left(X=k\right)$ is calculated by P = ipoisson(mu,k) , where k is a row or column vector of integers and the result P is a row matrix of the probabilities.
• $P\left(X\ge k\right)$ is calculated by P = cpoisson(mu,k) , where k is a row or column vector of integers and the result P is a row matrix of the probabilities.
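In the same spirit, here are Python sketches of the two calculations by direct summation (assumed illustrations, not the m-functions; adequate for moderate values of mu, whereas the actual m-functions are built for speed and wider parameter ranges):

```python
# P(X = k) and the upper tail P(X >= k) for X ~ Poisson(mu).
# Direct evaluation; mu**k and factorial(k) can overflow for large k.
from math import exp, factorial

def ipoisson(mu, k):
    """P(X = k) for X ~ Poisson(mu)."""
    return exp(-mu) * mu**k / factorial(k)

def cpoisson(mu, k):
    """P(X >= k) = 1 - sum_{j < k} P(X = j)."""
    return 1 - sum(ipoisson(mu, j) for j in range(k))
```

For instance, cpoisson(130, 110) reproduces the first entry of the arrival-counting example below.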

Poisson counting random variable

The number of messages arriving in a one minute period at a communications network junction is a random variable $N\sim$ Poisson (130). What is the probability the number of arrivals is greater than or equal to 110, 120, 130, 140, 150, 160 ?

>> p = cpoisson(130,110:10:160)
p  =  0.9666  0.8209  0.5117  0.2011  0.0461  0.0060

The descriptions of these distributions, along with a number of other facts, are summarized in the table DATA ON SOME COMMON DISTRIBUTIONS in Appendix C .
