\begin{singlespace}
\begin{center}
Artificial Neural Networks Applied to Stochastic Monte Carlo as a Way to Approximate Modified Heat Equations, and Their Associated Parameters, Depths, and Accuracies.
\end{center}
\vspace{0.5cm}
\begin{center}
A dissertation submitted in partial fulfillment \\
of the requirements for the degree of \\
Doctor of Philosophy in Mathematics
\end{center}
\vspace{1cm}
\begin{center}
by
\end{center}
\vspace{0.5cm}
\begin{center}
Shakil Ahmed Rafi \\
Troy University \\
Bachelor of Science in Mathematics, 2015 \\
University of Arkansas \\
Master of Science in Mathematics, 2019
\end{center}
\vspace{0.5cm}
\begin{center}
May 2024 \\
University of Arkansas
\end{center}
\vspace{0.5cm}
This dissertation is approved for recommendation to the Graduate Council.
\vspace{1.5cm}
\begin{center}
\noindent\hspace*{0cm}\rule{7cm}{0.7pt} \\
Joshua Lee Padgett, Ph.D.\\
Dissertation Director
\end{center}
\vspace{1cm}
\begin{minipage}{0.5\textwidth}
\begin{center}
\noindent\hspace*{0cm}\rule{7cm}{0.7pt} \\
Ukash Nakarmi, Ph.D.\\
Committee Member
\end{center}
\end{minipage}
\hfill
\begin{minipage}{0.5\textwidth}
\begin{center}
\noindent\hspace*{0cm}\rule{7cm}{0.7pt} \\
Jiahui Chen, Ph.D.\\
Committee Member
\end{center}
\end{minipage}
\vspace{0.5cm}
\begin{center}
\noindent\hspace*{0cm}\rule{7cm}{0.7pt} \\
Tulin Kaman, Ph.D.\\
Committee Member
\end{center}
\vspace{1cm}
\newpage
\begin{center}
\textbf{Abstract}
\end{center}
This dissertation seeks to explore a certain calculus for artificial neural networks. Specifically, we will look at versions of the heat equation and explore strategies for approximating their solutions.
\\~\\
At the outset, our strategy will be to take a technique called multilevel Picard (MLP) approximation and present a simplified version of it, showing that it converges to a solution of the equation $\lp \frac{\partial}{\partial t}u_d\rp\lp t,x\rp = \lp \nabla^2_x u_d\rp\lp t,x\rp$.
\\~\\
We will then take a small detour to explore the viscosity super-solution properties of solutions to such equations. It is here that we will first encounter the Feynman-Kac formula and see that solutions to these equations can be expressed as the expected value of a certain stochastic integral.
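Schematically, and deferring precise hypotheses to the later chapters, for the heat equation above this representation reads
\[
u_d\lp t,x\rp = \mathbb{E}\left[ u_d\lp 0, x + \sqrt{2}\,W_t\rp \right],
\]
where $\lp W_s\rp_{s \geq 0}$ denotes a standard Brownian motion; the notation here is only indicative.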
\\~\\
The final part of the dissertation will be dedicated to expanding a certain artificial neural network framework. We will build on this framework by introducing new operations, namely raising to a power, and use these to construct neural network polynomials. This opens the way to approximating transcendental functions such as $\exp\lp x\rp$, $\sin\lp x\rp$, and $\cos\lp x\rp$. This, coupled with a trapezoidal-rule mechanism for integration, allows us to approximate expressions of the form $\exp \lp \int_a^b \square \, dt\rp$.
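For concreteness, the trapezoidal rule alluded to here takes the standard form, on a partition $a = t_0 < t_1 < \cdots < t_n = b$ (the notation being illustrative),
\[
\int_a^b f\lp t\rp \, dt \approx \sum_{k=1}^n \frac{t_k - t_{k-1}}{2}\lp f\lp t_{k-1}\rp + f\lp t_k\rp\rp.
\]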
\\~\\
In the last chapter, we will look at how the neural network technology developed in the previous two chapters works towards approximating the expression that the Feynman-Kac formula asserts must be the solution to these modified heat equations. We will then end by giving approximate bounds for the error in the Monte Carlo method. Throughout, we will maintain that the parameter and depth estimates remain polynomial in $\frac{1}{\ve}$.
\\~\\
As an added bonus, we will also revisit the simplified MLP technique from the earlier chapters of this dissertation and show that it, too, can be approximated with artificial neural networks, and that this can be done with neural networks whose parameter and depth counts grow only polynomially in $\frac{1}{\ve}$.
\\~\\
Our appendix will contain code listings for these neural network operations, some of the architectures, and some small-scale simulation results.
\newpage
\begin{center}
\vspace*{\fill}
\copyright 2024 Shakil Ahmed Rafi \\
All rights reserved.
\vspace*{\fill}
\end{center}
\newpage
\begin{center}
\textbf{Acknowledgements}
\end{center}
I would like to acknowledge my advisor, Dr. Joshua Padgett, who has been instrumental in my Ph.D. journey. I am incredibly thankful that he took time out of his busy schedule to meet with me over the weekends and help me finish my dissertation. Without his help, guidance, and patience I would never have been where I am today. You not only taught me mathematics, but also how to be a mathematician. Thank you.
\\~\\
I would also like to thank my department and everyone in it, including, but not limited to, Dr. Andrew Raich, for his incredible patience and helpful guidance throughout the years. I would also like to thank Dr. Ukash Nakarmi for the excellent collaborations I have had with him, and Egan Meaux for all the little things he does to keep the department going.
\\~\\
I would like to acknowledge Marufa Mumu for believing in me when I didn't. You really made the last few months of writing this dissertation less painful.
\\~\\
I would like to acknowledge my cat, a beautiful Turkish Angora, Tommy. He was pretty useless, but stroking his fur made me stress a little less.
\\~\\
Finally, I would like to thank Valetta Ventures, Inc. and their product Texifier. It is a marvel of software engineering and made the process of creating this dissertation much less painful than it otherwise would have been.
\newpage
\begin{center}
\vspace*{\fill}
Dedicated to my grandparents, \\
M.A. Hye, M.A., \& Nilufar Hye\\
who would have loved to see this but cannot; \\
to my parents, \\
Kamal Uddin Ahmed, M.A., \& Shahnaz Parveen, M.A.,\\
who kept faith in me, always; \\
and finally to my brothers, \\
Wakil Ahmed Shabi, BBA \& Nabeel Ahmed Sami, B.Eng., \\
for whom I have been too imperfect a role model.\\
\vspace*{\fill}
\end{center}
\newpage
\begin{center}
\vspace*{\fill}
\textit{Read, in the name of your Lord}\\
\textemdash Surah Al-Alaq:1\\~\\
\textit{The conquest of nature must be achieved with number and measure.} \\
\textemdash Ren\'e Descartes \\
\vspace*{\fill}
\end{center}
\newpage
\end{singlespace}