diff --git a/.travis.yml b/.travis.yml
deleted file mode 100644
index a7287ae30..000000000
--- a/.travis.yml
+++ /dev/null
@@ -1,9 +0,0 @@
-language: python
-# command to install dependencies
-install:
-  - pip install -r requirements.txt
-python:
-  - "3.6" # current default Python on Travis CI
-# command to run tests
-script:
-  - pytest
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img1.JPG b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img1.JPG
new file mode 100644
index 000000000..6859dd90e
Binary files /dev/null and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img1.JPG differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img2.JPG b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img2.JPG
new file mode 100644
index 000000000..c1e71ee83
Binary files /dev/null and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img2.JPG differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img3.JPG b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img3.JPG
new file mode 100644
index 000000000..916685de4
Binary files /dev/null and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img3.JPG differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img4.JPG b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img4.JPG
new file mode 100644
index 000000000..19456cd10
Binary files /dev/null and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/img/lez11-img4.JPG differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.aux b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.aux
index 2258b5eb3..0b07e133c 100644
--- a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.aux
+++ b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.aux
@@ -1,6 +1,7 @@
 \relax
 \@nameuse{bbl@beforestart}
 \babel@aux{english}{}
-\@writefile{toc}{\contentsline {chapter}{\numberline {1}Lecture 10 - 07-04-2020}{1}\protected@file@percent }
+\@writefile{toc}{\contentsline {chapter}{\numberline {1}Lecture 11 - 20-04-2020}{1}\protected@file@percent }
 \@writefile{lof}{\addvspace {10\p@ }}
 \@writefile{lot}{\addvspace {10\p@ }}
+\@writefile{toc}{\contentsline {section}{\numberline {1.1}Analysis of $K_{NN}$}{1}\protected@file@percent }
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial
Intelligence/Machine Learning/lectures/lecture11.log b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.log index 7094189aa..bea17bf5e 100644 --- a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.log +++ b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.log @@ -1,4 +1,4 @@ -This is pdfTeX, Version 3.14159265-2.6-1.40.21 (MiKTeX 2.9.7300 64-bit) (preloaded format=pdflatex 2020.4.13) 20 APR 2020 08:41 +This is pdfTeX, Version 3.14159265-2.6-1.40.21 (MiKTeX 2.9.7300 64-bit) (preloaded format=pdflatex 2020.4.13) 20 APR 2020 09:36 entering extended mode **./lecture11.tex (lecture11.tex @@ -238,7 +238,7 @@ File: l3backend-pdfmode.def 2020-03-12 L3 backend support: PDF mode \l__kernel_color_stack_int=\count193 \l__pdf_internal_box=\box48 ) -No file lecture11.aux. +(lecture11.aux) \openout1 = `lecture11.aux'. LaTeX Font Info: Checking defaults for OML/cmm/m/it on input line 2. @@ -255,6 +255,7 @@ LaTeX Font Info: Checking defaults for OMX/cmex/m/n on input line 2. LaTeX Font Info: ... okay on input line 2. LaTeX Font Info: Checking defaults for U/cmr/m/n on input line 2. LaTeX Font Info: ... okay on input line 2. + ("C:\Program Files\MiKTeX 2.9\tex/context/base/mkii\supp-pdf.mkii" [Loading MPS to PDF converter (version 2006.09.02).] \scratchcounter=\count194 @@ -308,25 +309,709 @@ G,.JBIG2,.JB2,.eps] (grfext) \AppendGraphicsExtensions on input line 504. ) Chapter 1. +LaTeX Font Info: Trying to load font information for U+msa on input line 6. +("C:\Program Files\MiKTeX 2.9\tex/latex/amsfonts\umsa.fd" +File: umsa.fd 2013/01/14 v3.01 AMS symbols A +) +LaTeX Font Info: Trying to load font information for U+msb on input line 6. + +("C:\Program Files\MiKTeX 2.9\tex/latex/amsfonts\umsb.fd" +File: umsb.fd 2013/01/14 v3.01 AMS symbols B +) +! Missing delimiter (. inserted). + + { +l.8 \barra{E} \left{ + \ell_d} (\hat{\ell}_s ) \right] \leq 2 \cdot \ell_D \lef... +I was expecting to see something like `(' or `\{' or +`\}' here. If you typed, e.g., `{' instead of `\{', you +should probably delete the `{' by typing `1' now, so that +braces don't get unbalanced. Otherwise just proceed. +Acceptable delimiters are characters whose \delcode is +nonnegative, or you can use `\delimiter '. + +! Missing } inserted. + + } +l.8 ...ra{E}\left[ \, \| X = x_{\Pi(s,x) \| \right + ] +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + + +Underfull \hbox (badness 10000) in paragraph at lines 9--34 + + [] + + +Underfull \hbox (badness 10000) in paragraph at lines 9--34 + + [] + + +Underfull \hbox (badness 10000) in paragraph at lines 40--55 + + [] + +! Missing delimiter (. inserted). + + \varepsilon +l.56 ...| \right] \leq \barra{E} \left \varepsilon + \sqrt[]{d} \sum_{i = 1}^{... +I was expecting to see something like `(' or `\{' or +`\}' here. If you typed, e.g., `{' instead of `\{', you +should probably delete the `{' by typing `1' now, so that +braces don't get unbalanced. Otherwise just proceed. +Acceptable delimiters are characters whose \delcode is +nonnegative, or you can use `\delimiter '. + +! Extra \right. 
+l.56 ...i \} \cdot I\{C_i \cap S \neq 0 \} \right] + = +I'm ignoring a \right that had no matching \left. + + +Overfull \hbox (104.72401pt too wide) detected at line 57 +\U/msb/m/n/12 E [] \OMS/cmsy/m/n/12  \U/msb/m/n/12 E [] \OT1/cmr/m/n/12 + 2 \O +MS/cmsy/m/n/12  [] [] \OML/cmm/m/it/12 I\OMS/cmsy/m/n/12 f\OML/cmm/m/it/12 X \ +OMS/cmsy/m/n/12 2 \OML/cmm/m/it/12 C[]\OMS/cmsy/m/n/12 g  \OML/cmm/m/it/12 I\O +MS/cmsy/m/n/12 f\OML/cmm/m/it/12 C[] \OMS/cmsy/m/n/12 \ \OML/cmm/m/it/12 S \OMS +/cmsy/m/n/12 6\OT1/cmr/m/n/12 = 0\OMS/cmsy/m/n/12 g \OT1/cmr/m/n/12 = + [] + +! Missing } inserted. + + } +l.59 ..._i \} I \{c_1 \cap S \neq 0 \} \right] } + + 2 \sqrt[]{d} \sum_{i = ... +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Extra }, or forgotten $. +\@textcolor ...otect \leavevmode {\color #1{#2}#3} + +l.59 ..._i \} I \{c_1 \cap S \neq 0 \} \right] } + + 2 \sqrt[]{d} \sum_{i = ... +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + + +Overfull \hbox (15.41321pt too wide) detected at line 60 +\OT1/cmr/m/n/12 = \OML/cmm/m/it/12 "[]\U/msb/m/n/12 E [] \OT1/cmr/m/n/12 + 2[] +[] \U/msb/m/n/12 E [] + [] + [1 {C:/Users/AndreDany/AppData/Local/MiKTeX/2.9/pdftex/config/pdftex.map}] -(lecture11.aux) ) +! Extra \right. + ...\} I \{c_1 \cap S \neq 0 \} \right ] + $ +l.61 ...i \} I \{c_1 \cap S \neq 0 \} \right] $} + +I'm ignoring a \right that had no matching \left. + +! Missing number, treated as zero. + + $ +l.68 $ + $ +A number should have been here; I inserted `0'. +(If you can't figure out why I needed to see a number, +look up `weird error' in the index to The TeXbook.) + +! Undefined control sequence. +l.74 \barra{E} \left \I + \{X \in C_i \} \cdot I \{ C_1 \cap S \neq 0 \} \right... +The control sequence at the end of the top line +of your error message was never \def'ed. If you have +misspelled it (e.g., `\hobx'), type `I' and the correct +spelling (e.g., `I\hbox'). Otherwise just continue, +and I'll forget about whatever was undefined. + +! Missing \right. inserted. + + \right . +l.75 $ + $ +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + + +Underfull \hbox (badness 10000) in paragraph at lines 75--78 + + [] + +! Missing $ inserted. + + $ +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Extra }, or forgotten $. +\textdef@ ...th {#1}\let \f@size #2\selectfont #3} + } +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. 
+\textdef@ ...h {#1}\let \f@size #2\selectfont #3}} + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\text@ ...e {\textdef@ \displaystyle \f@size {#1}} + {\textdef@ \textstyle \f@s... +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Missing $ inserted. + + $ +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Extra }, or forgotten $. +\textdef@ ...th {#1}\let \f@size #2\selectfont #3} + } +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\textdef@ ...h {#1}\let \f@size #2\selectfont #3}} + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\text@ ...xtstyle \f@size {\firstchoice@false #1}} + {\textdef@ \textstyle \sf@... +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Missing $ inserted. + + $ +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Extra }, or forgotten $. +\textdef@ ...th {#1}\let \f@size #2\selectfont #3} + } +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\textdef@ ...h {#1}\let \f@size #2\selectfont #3}} + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\text@ ...tstyle \sf@size {\firstchoice@false #1}} + {\textdef@ \textstyle \ssf... 
+l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Missing $ inserted. + + $ +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Extra }, or forgotten $. +\textdef@ ...th {#1}\let \f@size #2\selectfont #3} + } +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\textdef@ ...h {#1}\let \f@size #2\selectfont #3}} + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\text@ ...style \ssf@size {\firstchoice@false #1}} + \check@mathfonts } +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\text@ ...firstchoice@false #1}}\check@mathfonts } + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\@textcolor ...otect \leavevmode {\color #1{#2}#3} + +l.93 ...0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}} + = +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Missing $ inserted. + + $ +l.95 where $\bred{p \, e ^{-m\,p}} + $ is $F(p)$ +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Extra }, or forgotten $. + \egroup + +l.95 where $\bred{p \, e ^{-m\,p}} + $ is $F(p)$ +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Extra }, or forgotten $. +\@textcolor ...otect \leavevmode {\color #1{#2}#3} + +l.95 where $\bred{p \, e ^{-m\,p}} + $ is $F(p)$ +I've deleted a group-closing symbol because it seems to be +spurious, as in `$x}$'. But perhaps the } is legitimate and +you forgot something else, as in `\hbox{$x}'. 
In such cases +the way to recover is to insert both the forgotten and the +deleted material, e.g., by typing `I$}'. + +! Missing $ inserted. + + $ +l.99 F^ + (p) = 0 \Leftrightarrow p = \frac{1}{m} \quad check! +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Missing $ inserted. + + $ +l.108 \end{document} + +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Improper \prevdepth. +\newpage ...everypar {}\fi \par \ifdim \prevdepth + >\z@ \vskip -\ifdim \prevd... +l.108 \end{document} + +You can refer to \spacefactor only in horizontal mode; +you can refer to \prevdepth only in vertical mode; and +neither of these is meaningful inside \write. So +I'm forgetting what you said and using zero instead. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing $ inserted. + + $ +l.108 \end{document} + +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing $ inserted. + + $ +l.108 \end{document} + +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. 
+ + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing $ inserted. + + $ +l.108 \end{document} + +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing $ inserted. + + $ +l.108 \end{document} + +I've inserted a begin-math/end-math symbol since I think +you left one out. Proceed, with fingers crossed. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing { inserted. + + $ +l.108 \end{document} + +A left brace was mandatory here, so I've put one in. +You might want to delete and/or insert some corrections +so that I will find a matching right brace soon. +(If you're confused by all this, try typing `I}' now.) + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing { inserted. + + $ +l.108 \end{document} + +A left brace was mandatory here, so I've put one in. +You might want to delete and/or insert some corrections +so that I will find a matching right brace soon. +(If you're confused by all this, try typing `I}' now.) + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing { inserted. + + $ +l.108 \end{document} + +A left brace was mandatory here, so I've put one in. +You might want to delete and/or insert some corrections +so that I will find a matching right brace soon. +(If you're confused by all this, try typing `I}' now.) + +! Missing } inserted. 
+ + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Missing } inserted. + + } +l.108 \end{document} + +I've inserted something that you may have forgotten. +(See the above.) +With luck, this will get me unwedged. But if you +really didn't forget anything, try typing `2' now; then +my insertion and my current dilemma will both disappear. + +! Display math should end with $$. + + \vfil +l.108 \end{document} + +The `$' that I just saw supposedly matches a previous `$$'. +So I shall assume that you typed `$$' both times. + + +Overfull \hbox (522.37697pt too wide) detected at line 108 +\OMS/cmsy/m/n/12  [] []  \OML/cmm/m/it/12 r [] [][] + [] + +[2] (lecture11.aux) ) Here is how much of TeX's memory you used: - 5026 strings out of 480934 - 67628 string characters out of 2909670 - 328074 words of memory out of 3000000 - 20811 multiletter control sequences out of 15000+200000 - 534950 words of font info for 28 fonts, out of 3000000 for 9000 + 5098 strings out of 480934 + 68778 string characters out of 2909670 + 334074 words of memory out of 3000000 + 20855 multiletter control sequences out of 15000+200000 + 544996 words of font info for 60 fonts, out of 3000000 for 9000 1141 hyphenation exceptions out of 8191 - 42i,5n,50p,333b,124s stack positions out of 5000i,500n,10000p,200000b,50000s - -Output written on lecture11.pdf (1 page, 7247 bytes). + 42i,16n,50p,333b,256s stack positions out of 5000i,500n,10000p,200000b,50000s + < +C:\Users\AndreDany\AppData\Local\MiKTeX\2.9\fonts/pk/ljfour/jknappen/ec/dpi600\ +ecbx1728.pk> +Output written on lecture11.pdf (2 pages, 166486 bytes). PDF statistics: - 27 PDF objects out of 1000 (max. 8388607) + 197 PDF objects out of 1000 (max. 8388607) 0 named destinations out of 1000 (max. 500000) 1 words of extra memory for PDF output out of 10000 (max. 
10000000)
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.pdf b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.pdf
index ae631a6e3..5adb236a3 100644
Binary files a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.pdf and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.pdf differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.synctex.gz b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.synctex.gz
index bd0863d7c..2b00f4492 100644
Binary files a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.synctex.gz and b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.synctex.gz differ
diff --git a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.tex b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.tex
index e9ac936c7..2cdd43acf 100644
--- a/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.tex
+++ b/1year/3trimester/Machine Learning, Statistical Learning, Deep Learning and Artificial Intelligence/Machine Learning/lectures/lecture11.tex
@@ -3,5 +3,253 @@
 \chapter{Lecture 11 - 20-04-2020}
+\section{Analysis of $\knn$}
+$$
+\barra{E} \left[ \ell_D (\hat{h}_S) \right] \leq 2 \cdot \ell_D \left( f^* \right) + c \cdot \barra{E} \left[ \, \| X - X_{\Pi(S,X)} \| \, \right]
+$$
+At what rate does this bound go down? If the number of dimensions goes up, then many points are far away from $X$. \\
+So this quantity must depend on the space in which $X$ lives.
+\\ We expect some dependence on the number of dimensions, and the bound should improve as the number of training points close to $X$ increases.\\
+This expectation is a function of the random variables $X$ and $X_{\Pi(S,X)}$.
+\\\\
+We are going to use the assumption that:
+\\
+$| X_i | \leq 1 \qquad \forall$ coordinates $i = 1, \dots, d$
+\\
+--- FIGURE ---
+\\
+The drawing shows the hyperbox in two dimensions. All points live in this box, and we exploit that.
+Look at the little squares into which it is divided: we divide the box into small cells of side $\varepsilon$. The training points are then sprinkled across the big square. \\
+Our training points are distributed in the box (this is our $S$).
+\\
+Now we add a point $x$, and two things can happen:
+it falls in a cell containing training points, or in a cell without training points.
+\\
+What is the distance $\| X - X_{\Pi(S,X)} \|$ in these two cases?
+\\
+Call the cells $C_1$ up to $C_r$.
+How big is the distance in each of the two cases?
+(We are looking at specific choices of $x$ and $S$.)
+\\
+$$
+\| X - X_{\Pi(S,X)} \| \leq
+\begin{cases}
+\varepsilon \sqrt{d} & \text{if } C_i \cap S \neq \emptyset \\
+2 \sqrt{d} & \text{if } C_i \cap S = \emptyset
+\end{cases}
+$$
+where $X \in C_i$.
+\\
+In the first case we have to multiply the side of the cell by $\sqrt{d}$: the diagonal of a cube of side $\varepsilon$ is $\varepsilon \sqrt{d}$.
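+As a quick sanity check (a verification of mine, implicit in the lecture): if $x$ and $y$ lie in the same cell of side $\varepsilon$, each coordinate differs by at most $\varepsilon$, hence
+$$
+\| x - y \| = \sqrt{ \sum_{i=1}^{d} (x_i - y_i)^2 } \leq \sqrt{ d \, \varepsilon^2 } = \varepsilon \sqrt{d} \, ,
+$$
+and the same computation with side $2$ (the whole box) gives the worst-case bound $2 \sqrt{d}$.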
+\\
+--- FIGURE ---
+\\
+If things go badly, the nearest neighbor can be as far away as the whole domain:
+the side of the box is $2$, so its diagonal is $2 \sqrt{d}$.
+\\
+Hence the two points are either $\varepsilon$-close, or as far apart as the domain allows.
+\\
+We can split the expression inside the expectation according to the two cases:
+$$
+\barra{E} \left[ \| X - X_{\Pi(S,X)} \| \right] \leq \barra{E} \left[ \varepsilon \sqrt{d} \, \sum_{i = 1}^{r} I \{ X \in C_i \} \cdot I \{ C_i \cap S \neq \emptyset \} + 2 \sqrt{d} \, \sum_{i=1}^{r} I \{ X \in C_i \} \cdot I \{ C_i \cap S = \emptyset \} \right] =
+$$
+$$
+= \varepsilon \sqrt{d} \; \barra{E} \left[ \red{ \sum_{i = 1}^{r} I \{ X \in C_i \} \, I \{ C_i \cap S \neq \emptyset \} } \right] + 2 \sqrt{d} \, \sum_{i = 1}^{r} \barra{E} \left[ I \{ X \in C_i \} \cdot I \{ C_i \cap S = \emptyset \} \right]
+$$
+The first (red) sum is harmless: $X$ belongs to exactly one cell $C_i$, so the sum is either $0$ or $1$,
+and its expectation is at most $1$. Therefore the bound is
+$$
+\leq \ \varepsilon \sqrt{d} + 2 \sqrt{d} \, \sum_{i = 1}^{r} \barra{E} \left[ I \{ X \in C_i \} \cdot I \{ C_i \cap S = \emptyset \} \right]
+$$
+We can bound the remaining term. Look at the two indicator events inside the summation:
+if they are independent, the expectation of their product is the product of the two expectations.
+Once the cell is fixed, the first event depends only on $X$ and the second only on $S$,
+and $X$ and $S$ are independent.
+\\
+So the two events are independent: \bred{$\{ X \in C_i \}$ is independent of $\{ C_i \cap S = \emptyset \}$}, and
+$$
+\barra{E} \left[ I \{ X \in C_i \} \cdot I \{ C_i \cap S = \emptyset \} \right] = \barra{E} \left[ I \{ X \in C_i \} \right] \cdot \barra{E} \left[ I \{ C_i \cap S = \emptyset \} \right]
+$$
+--- MISSING (lecture, 9:26) ---
+\\
+$$
+\barra{P} \left( C_i \cap S = \emptyset \right) = \left( 1 - \barra{P} \left( X \in C_i \right) \right)^m \leq \exp \left( - m \, \barra{P} \left( X \in C_i \right) \right)
+$$
+The cell $C_i$ misses all of $S$ exactly when each of the $m$ training points fails to fall in it,
+and each point falls in $C_i$ with probability $\barra{P} ( X \in C_i )$.
+\\
+Here we used the inequality $(1 - p)^m \leq e^{-p\,m}$, which follows from $1 + x \leq e^x$.
+\\
+--- FIGURE ---
+\\
+$$
+\sum_{i = 1}^{r} \barra{P} (X \in C_i) \, \barra{P} (C_i \cap S = \emptyset) \leq \sum_{i = 1}^{r} p_i \, e^{-m \, p_i}
+$$
+where $p_i = \barra{P} (X \in C_i)$. I can upper bound this:
+$$
+\sum_{i = 1}^{r} p_i \, e^{- m \, p_i} \leq \sum_{i = 1}^{r} \left( \max_{0 \leq p \leq 1} \, p \, e^{- m \, p} \right) = r \, \max_{0 \leq p \leq 1} \, \bred{p \, e ^{-m\,p}}
+$$
+where $\bred{p \, e ^{-m\,p}}$ is $F(p)$;
+it is a concave function, so I take the first-order derivative to maximize it:
+$$
+F'(p) = 0 \Leftrightarrow p = \frac{1}{m}, \qquad F''(p) \leq 0 \qquad \text{check these two conditions!}
+$$
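+Carrying out the two checks requested above (a quick verification of mine, not spelled out in the notes): with $F(p) = p \, e^{-m\,p}$,
+$$
+F'(p) = (1 - m\,p) \, e^{-m\,p} = 0 \ \Leftrightarrow \ p = \frac{1}{m}, \qquad
+F''(p) = m \, (m\,p - 2) \, e^{-m\,p}, \qquad F''\!\left( \tfrac{1}{m} \right) = -\frac{m}{e} < 0,
+$$
+so $p = \frac{1}{m}$ is indeed a maximum, with value $F \left( \frac{1}{m} \right) = \frac{1}{e\,m}$.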
+Hence
+$$
+r \, \max_{0 \leq p \leq 1} F(p) = \frac{r}{e \, m}
+$$
+Now plug this back into the expectation:
+$$
+\barra{E} \left[ \| X - X_{\Pi(S,X)} \| \right] \leq \varepsilon \sqrt{d} + \left( 2 \sqrt{d} \right) \frac{r}{e \, m}
+$$
+We have $r = \left( \frac{2}{\varepsilon} \right)^d$ cells. This brings $\varepsilon$ into the game:
+$$
+\varepsilon \sqrt{d} + \left( 2 \sqrt{d} \right) \frac{1}{e \, m} \left( \frac{2}{\varepsilon} \right)^d = \sqrt{d} \left( \varepsilon + \frac{2}{e \, m} \left( \frac{2}{\varepsilon} \right)^d \right)
+$$
+\blue{Note: the constant $c$ from the initial bound is being dropped here.}
+\\
+We can choose $\varepsilon$ to balance the two terms:
+set $\varepsilon = 2 \, m^{-\frac{1}{d+1}}$.
+Then $\left( \frac{2}{\varepsilon} \right)^d = m^{\frac{d}{d+1}}$, so the second term equals $\frac{2}{e} \, m^{-\frac{1}{d+1}}$ and
+$$
+\bred{ \varepsilon + \frac{2}{e \, m} \left( \frac{2}{\varepsilon} \right)^d \leq 4 \, m^{-\frac{1}{d+1}} }
+$$
+Putting everything together (restoring the constant $c$):
+$$
+\barra{E} \left[ \ell_D (\hat{h}_S) \right] \leq 2 \, \ell_D (f^*) + 4 \, c \, \sqrt{d} \cdot m^{-\frac{1}{d+1}}
+$$
+We have that:\\
+if $m \longrightarrow \infty$, then $\ell_D (f^*) \leq \barra{E} \left[ \ell_D (\hat{h}_S) \right] \leq 2 \, \ell_D (f^*)$
+(the lower bound always holds; the upper bound holds in the limit).
+\\
+I want this smaller than twice the Bayes risk plus some small quantity:
+$$
+\barra{E} \left[ \ell_D (\hat{h}_S) \right] \leq 2 \, \ell_D (f^*) + \varepsilon
+$$
+How big must $m$ be? Ignore the constant factor $4 \, c \, \sqrt{d}$, since it is comparatively small:
+$$
+m^{-\frac{1}{d+1}} \leq \varepsilon \ \Leftrightarrow \ m \geq \left( \frac{1}{\varepsilon} \right)^{d+1}
+$$
+So $1$-NN requires a training set size exponential in $d$ to reach ``accuracy'' $1 - \varepsilon$;
+for instance, $d = 10$ and $\varepsilon = 0.1$ already require $m \geq 10^{11}$.
+\\\\
+We showed that $1$-NN can approach twice the Bayes risk $2 \, \ell_D (f^*)$,\\
+but it takes a training set exponential in $d$.
+\\
+\subsection{Study of $\knn$}
+Maybe we can use $\knn$ instead:
+$$
+\barra{E} \left[ \ell_D (\hat{h}_S) \right] \leq \left( 1 + \sqrt{\frac{8}{k}} \right) \ell_D (f^*) + O \left( k \, m^{-\frac{1}{d+1}} \right)
+$$
+So it is not exponential here.
+\\
+\bred{A learning algorithm $A$ is consistent for a certain loss $\ell$}
+\\
+if for every distribution $D$ of the data, writing $A(S_m)$ for the predictor output by $A$ on a training set of size $m$,\\
+the expected risk $\barra{E} \left[ \ell_D (A(S_m)) \right]$ converges to the Bayes risk as the training set grows:
+$$
+\lim_{m \rightarrow \infty} \, \barra{E} \left[ \ell_D (A(S_m)) \right] = \ell_D (f^*)
+$$
+\\
+Consider $\knn$ where $K = K_m$ is a function of the training set size, with $K_m \rightarrow \infty$ as $m \rightarrow \infty$.
+\\
+$K_m$ must go to infinity sublinearly in the training set size (to infinity, but not as quickly as $m$): $K_m = o(m)$.
+\\\\
+For instance, $K_m = \sqrt{m}$.
+\\
+Then:
+$$
+\lim_{m \rightarrow \infty} \barra{E} \left[ \ell_D \left( A' \left( S_m \right) \right) \right] = \ell_D (f^*) \qquad \textbf{where $A'$ is $K_m$-NN}
+$$
+Increasing the training set size, we converge to the Bayes risk for any distribution, and that is nice.
+\\\\
+\subsection{Study of trees}
+Algorithms that grow tree classifiers can also be made consistent, provided two conditions hold:
+\begin{itemize}
+\item the tree keeps growing;
+\item a non-vanishing fraction of the training examples is routed to each leaf.
+\end{itemize}
+The tree has to keep growing, but not too fast.\\
+The second point says: suppose you have a certain number of leaves, and look at the fraction of examples each leaf receives.
+Each leaf $\ell$ gets $N_\ell$ examples, and we want this fraction not to go to $0$ at any point in time;
+with every split we reduce the smallest number of examples a leaf receives.
+\\ Examples keep arriving and the leaves keep growing, and we want $\frac{N_\ell}{\,\cdot\,}$ not to go to $0$
+(the denominator was not shown in the lecture).
+\\\\
+Given $A$, how do I know whether $A$ could be consistent?
+$$
+H_A \equiv \{ \ h \ : \ \exists S \ A(S) = h \}
+$$
+Here $S$ can be of any size.
+If $A$ is $ERM$ over a class $H$, then $H_A = H$, the class over which $ERM$ minimizes.
+\\
+If there exists $f^* : \mathcal{X} \longrightarrow \mathcal{Y}$ such that $f^* \notin H_A$, and there exists a distribution $D$ for which $f^*$ is Bayes optimal,
+then $A$ cannot be consistent: it will never be able to generate the Bayes optimal predictor
+(unless $H_A$ happens to contain some other predictor $f \neq f^*$ with the same risk as $f^*$).
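+A concrete instance (an illustration of mine, not from the lecture, using zero-one loss): let $A$ be $ERM$ over linear classifiers in two dimensions, and let $D$ put equal mass on the four points $(\pm 1, \pm 1)$ with deterministic labels given by the XOR pattern
+$$
+f^*(x) = \mathrm{sign} (x_1 \, x_2), \qquad \ell_D (f^*) = 0.
+$$
+No linear classifier labels all four points correctly, so $f^* \notin H_A$ and every $h \in H_A$ has risk at least $\frac{1}{4}$: this $A$ cannot be consistent for this $D$.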
+\\\\
+What's the intuition?
+\\Every time $A$ is such that $H_A$ is ``restricted'' in some sense, then $A$ cannot be consistent (e.g.\ $ERM$).
+\\\\
+Another way of restricting: tree classifiers with at most $N$ nodes (bounding the number of nodes).
+\\ How do I know whether $N$ is big enough to approximate $f^*$ well?
+I want to converge to the risk of $f^*$.
+\\
+We can instead introduce a class of algorithms, potentially consistent, in which the space of predictors is not restricted.
+\\\\
+\section{Non-parametric Algorithms}
+These are the algorithms that are potentially consistent.
+\\ What does that mean?\\
+Non-parametric algorithms have the potential of being consistent; but how do we know whether an algorithm is parametric or not?
+\\
+$A$ is non-parametric if:
+\begin{itemize}
+\item the description of $A(S_m)$ grows with $m$.
+\end{itemize}
+The predictor is a function, and let's assume I can store a real number with arbitrary precision in any variable.
+\\\\
+\bred{Any algorithm with non-vanishing bias is inconsistent: consistency is exactly the ability to converge to the Bayes risk.}
+\\
+How do I know whether an algorithm has bias? This is where non-parametric algorithms come in.
+\\
+Consider $\knn$: how can I describe its output? I have to remember all the training points, since distances are measured to them, and if you give me a larger $S$ then $m$ increases. So $\knn$ is non-parametric.
+\\
+The same goes for trees: give the tree a larger training set and it will grow larger, and it keeps growing as the data grows.
+\\
+Any algorithm whose output keeps growing as I give it more training points is non-parametric, while the output of a parametric algorithm stops growing at some point.
+\\
+--- FIGURE ---
+\\
+If the algorithm is parametric, its set of reachable predictors stops growing at some point as I give it more training data,\\
+and $f^*$ can be left outside no matter how much more data I provide.
+\\
+If the algorithm is able to generate --- MISSING ---
+then the algorithm is non-parametric and can be potentially consistent, since its reach grows to include $f^*$.
+\\
+If the set of predictors stops growing, because beyond some point the description of $A(S)$ no longer depends on the training set size, then the algorithm cannot in general be consistent.
+\\
+If the bias vanishes as I increase $S$, then the algorithm can be consistent: it generates predictors whose description depends on how many points I give it.
+\\\\
+``Parametric'' is not as precise a notion as consistency.
+\\
+The class of algorithms that can achieve consistency is the class whose predictor size keeps growing as $S$ grows.
+\\
+The definition of non-parametric is fuzzier; consistency is precise (we can demonstrate it mathematically).
+\\\\
+\subsection{Example of parametric algorithms}
+A neural network is parametric, since I fix the structure of the network in advance:
+whether $S$ is small or big, my structure stays the same (it will simply fit the training points better).
+\\
+Other examples are algorithms producing a linear classifier, in which the number of parameters is just the dimension of the space.
 \end{document}
\ No newline at end of file