# 0.11 Decision trees  (Page 4/5)


We define our estimator as

$\hat{f}_n^H = \hat{f}_n^{(\hat{k})},$

where

$\hat{f}_n^{(k)} = \arg \min_{f \in \mathcal{F}_k^H} \hat{R}_n(f),$

and

$\hat{k} = \arg \min_{k \ge 1} \left\{ \hat{R}_n\left(\hat{f}_n^{(k)}\right) + \sqrt{\frac{(k + k^2)\log 2 + \frac{1}{2}\log n}{2n}} \right\}.$

Therefore ${\stackrel{^}{f}}_{n}^{H}$ minimizes

$\hat{R}_n(f) + \sqrt{\frac{c_H(f)\log 2 + \frac{1}{2}\log n}{2n}},$

over all $f\in {\mathcal{F}}^{H}$ . We showed before that

$E\left[R\left(\hat{f}_n^H\right)\right] - R^* \le \min_{f \in \mathcal{F}^H} \left\{ R(f) - R^* + \sqrt{\frac{c_H(f)\log 2 + \frac{1}{2}\log n}{2n}} \right\} + \frac{1}{\sqrt{n}}.$

To proceed with our analysis we need to make some assumptions on the intrinsic difficulty of the problem. We will assume that the Bayes decision boundary is a “well-behaved” 1-dimensional set, in the sense that it has box-counting dimension one (see Appendix "Box Counting Dimension"). This implies that, for a histogram with $k^2$ bins, the Bayes decision boundary intersects fewer than $Ck$ bins, where $C$ is a constant that does not depend on $k$. Furthermore, we assume that the marginal distribution of $X$ satisfies $P_X(A) \le K|A|$ for any measurable subset $A \subseteq [0,1]^2$, where $|A|$ denotes the volume of $A$. This means that the samples collected do not accumulate anywhere in the unit square.
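To get a feel for the box-counting assumption, the sketch below counts how many bins of a $k \times k$ partition of $[0,1]^2$ a smooth one-dimensional boundary passes through. The particular curve $y = 1/2 + \sin(2\pi x)/4$ is a hypothetical example, and the dense sampling is only an approximation to an exact intersection test; the point is that the count grows like $Ck$, not like the total number of bins $k^2$.

```python
import math

def bins_hit_by_curve(k, f, samples_per_bin=50):
    """Count the bins of a k x k partition of [0,1]^2 that the
    curve y = f(x) passes through (by dense sampling in x)."""
    hit = set()
    n_samples = samples_per_bin * k
    for i in range(n_samples + 1):
        x = i / n_samples
        y = f(x)
        if 0.0 <= y <= 1.0:
            # Index of the bin containing (x, y); clamp the right/top edges.
            hit.add((min(int(x * k), k - 1), min(int(y * k), k - 1)))
    return len(hit)

# Hypothetical smooth boundary with box-counting dimension one.
curve = lambda x: 0.5 + math.sin(2 * math.pi * x) / 4

for k in [8, 16, 32, 64]:
    hits = bins_hit_by_curve(k, curve)
    print(k, hits, round(hits / k, 2))   # hits/k stays bounded (the constant C)
```

For this particular curve the ratio hits$/k$ stays bounded by a small constant as $k$ grows, which is exactly the behavior the assumption postulates.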

Under the above assumptions, the best classifier in $\mathcal{F}_k^H$ can disagree with the Bayes classifier only on the bins intersected by the boundary; there are fewer than $Ck$ of these, each of volume $1/k^2$, so their total probability mass is at most $KCk/k^2$. We conclude that

$\min_{f \in \mathcal{F}_k^H} R(f) - R^* \le \frac{K}{k^2} \, Ck = \frac{CK}{k}.$

Therefore

$E\left[R\left(\hat{f}_n^H\right)\right] - R^* \le \frac{CK}{k} + \sqrt{\frac{(k + k^2)\log 2 + \frac{1}{2}\log n}{2n}} + \frac{1}{\sqrt{n}}.$

We can balance the two dominant terms on the right-hand side of the above expression by taking $k = n^{1/4}$ (for $n$ large); therefore

$E\left[R\left(\hat{f}_n^H\right)\right] - R^* = O\left(n^{-1/4}\right), \quad \text{as } n \to \infty.$
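A quick numerical sanity check of the balancing step (not part of the argument): with the constant $CK$ set arbitrarily to 1, the choice $k = n^{1/4}$ lands within a constant factor of the brute-force minimizer of the bound, and the bound itself decays like $n^{-1/4}$, so the printed value rescaled by $n^{1/4}$ stays roughly constant.

```python
import math

def bound(k, n, CK=1.0):
    """Right-hand side of the histogram excess-risk bound (CK = 1 is arbitrary)."""
    approx = CK / k   # approximation error term
    estim = math.sqrt(((k + k * k) * math.log(2) + 0.5 * math.log(n)) / (2 * n))
    return approx + estim + 1 / math.sqrt(n)

for n in [10**4, 10**6, 10**8]:
    k_star = round(n ** 0.25)
    k_best = min(range(1, 200), key=lambda k: bound(k, n))  # brute-force minimizer
    # Rescaled by n^{1/4}: roughly constant, matching the O(n^{-1/4}) rate.
    print(n, k_star, k_best, round(bound(k_star, n) * n ** 0.25, 3))
```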

Now let us consider dyadic decision trees under the assumptions above, and contrast them with the histogram classifier. Let

$\mathcal{F}_k^T = \left\{ \text{tree classifiers with } k \text{ leaves} \right\}.$

Let $\mathcal{F}^T = \bigcup_{k \ge 1} \mathcal{F}_k^T$. We can prefix-encode each element $f$ of $\mathcal{F}^T$ with $c_T(f) = 3k - 1$ bits, as described before.
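The count $c_T(f) = 3k - 1$ can be verified with a small sketch: writing the tree in preorder with one bit per internal node and two bits per leaf (a marker plus the class label) uses exactly $(k-1) + 2k = 3k - 1$ bits for a tree with $k$ leaves. The nested-tuple representation below is just an illustrative convention.

```python
def encode(tree):
    """Preorder prefix code: '0' marks an internal node; a leaf with
    class label b in {0, 1} is written as '1' followed by b."""
    if isinstance(tree, tuple):        # internal node: (left, right)
        left, right = tree
        return "0" + encode(left) + encode(right)
    return "1" + str(tree)             # leaf: its class label

def num_leaves(tree):
    if isinstance(tree, tuple):
        return num_leaves(tree[0]) + num_leaves(tree[1])
    return 1

# A tree with k = 3 leaves: (k - 1) internal bits + 2k leaf bits = 3k - 1 = 8.
t = ((1, 0), 1)
print(encode(t), len(encode(t)))   # 00111011 8
```

Because no codeword is a prefix of another, a decoder reading the bit stream always knows where one tree description ends and the next begins, which is what the complexity-penalty argument requires.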

Let

$\hat{f}_n^T = \hat{f}_n^{(\hat{k})},$

where

$\hat{f}_n^{(k)} = \arg \min_{f \in \mathcal{F}_k^T} \hat{R}_n(f),$

and

$\hat{k} = \arg \min_{k \ge 1} \left\{ \hat{R}_n\left(\hat{f}_n^{(k)}\right) + \sqrt{\frac{(3k - 1)\log 2 + \frac{1}{2}\log n}{2n}} \right\}.$

Hence ${\stackrel{^}{f}}_{n}^{T}$ minimizes

$\hat{R}_n(f) + \sqrt{\frac{c_T(f)\log 2 + \frac{1}{2}\log n}{2n}},$

over all $f\in {\mathcal{F}}^{T}$ . Moreover

$E\left[R\left(\hat{f}_n^T\right)\right] - R^* \le \min_{f \in \mathcal{F}^T} \left\{ R(f) - R^* + \sqrt{\frac{c_T(f)\log 2 + \frac{1}{2}\log n}{2n}} \right\} + \frac{1}{\sqrt{n}}.$

If the Bayes decision boundary is a 1-dimensional set, as in "Histogram Risk Bound", there exists a tree with at most $8Ck$ leaves such that the boundary is contained in at most $Ck$ squares, each of volume $1/k^2$. To see this, start with the tree yielding the histogram partition with $k^2$ bins (i.e., the tree partitioning the unit square into $k^2$ equal-sized squares), and prune all the nodes that do not intersect the boundary. In [link] we illustrate the procedure. If you carefully bound the number of leaves needed at each level, you can show that in total fewer than $8Ck$ leaves remain. We conclude that there exists a tree with at most $8Ck$ leaves that has the same risk as a histogram with $O(k^2)$ bins. Therefore, using [link], we have

$E\left[R\left(\hat{f}_n^T\right)\right] - R^* \le \frac{CK}{k} + \sqrt{\frac{(3(8Ck) - 1)\log 2 + \frac{1}{2}\log n}{2n}} + \frac{1}{\sqrt{n}}.$

We can balance the two dominant terms on the right-hand side of the above expression by taking $k = n^{1/3}$ (for $n$ large); therefore

$E\left[R\left(\hat{f}_n^T\right)\right] - R^* = O\left(n^{-1/3}\right), \quad \text{as } n \to \infty.$

Illustration of the tree pruning procedure: (a) histogram classification rule for a partition with 16 bins, and the corresponding binary tree representation (with 16 leaves); (b) pruned version of the histogram tree, yielding exactly the same classification rule but now requiring only 6 leaves. (Note: the trees were constructed using the procedure of Figure .)
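The pruning argument can also be checked numerically. The sketch below grows the dyadic tree for a $k \times k$ partition (alternating vertical and horizontal splits), prunes every subtree whose cell misses a hypothetical boundary curve, and counts the remaining leaves; the intersection test by dense sampling is only an approximation. The leaf count grows like $O(k)$, far below the $k^2$ bins of the unpruned histogram tree.

```python
import math

def intersects(cell, f, samples=64):
    """Approximate test: does the graph of y = f(x) enter the cell
    (x0, x1, y0, y1)?  Checked by dense sampling in x."""
    x0, x1, y0, y1 = cell
    return any(y0 <= f(x0 + (x1 - x0) * i / samples) <= y1
               for i in range(samples + 1))

def pruned_leaves(cell, f, depth, max_depth, split_x=True):
    """Number of leaves of the dyadic tree over `cell`, after pruning
    every subtree whose cell does not meet the boundary curve."""
    if depth == max_depth or not intersects(cell, f):
        return 1    # cell is pure (or at the finest resolution): one leaf
    x0, x1, y0, y1 = cell
    if split_x:     # alternate vertical and horizontal dyadic splits
        xm = (x0 + x1) / 2
        kids = [(x0, xm, y0, y1), (xm, x1, y0, y1)]
    else:
        ym = (y0 + y1) / 2
        kids = [(x0, x1, y0, ym), (x0, x1, ym, y1)]
    return sum(pruned_leaves(c, f, depth + 1, max_depth, not split_x)
               for c in kids)

# Hypothetical boundary curve; depth 2j yields the k x k partition, k = 2**j.
curve = lambda x: 0.5 + math.sin(2 * math.pi * x) / 4
for j in [3, 4, 5, 6]:
    k = 2 ** j
    print(k, k * k, pruned_leaves((0, 1, 0, 1), curve, 0, 2 * j))
```

Doubling $k$ roughly doubles the pruned leaf count while the histogram bin count quadruples, which is the source of the better $n^{-1/3}$ rate.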

Trees generally work much better than histogram classifiers. This is essentially because they provide much more efficient ways of approximating the Bayes decision boundary: as we saw in our example, under reasonable assumptions on the Bayes boundary, a tree encoded with $O(k)$ bits can describe the same classifier as a histogram that requires $O(k^2)$ bits.
