Neural Network Theory

Create an RMarkdown file and turn in both the knitted version (as HTML) and the .Rmd file. Your file should answer the following questions.

Empirical Chain Rule (16 pts)

1. Create a function `p` with the following features:
   - `p` must calculate a degree-5 polynomial.
   - `p` must have two local maxima and two local minima between x = -2 and x = 5.
   - `p` must be appropriately vectorized.
2. Create a function `q` with the following features:
   - `q` must calculate a degree-3 polynomial.
   - `q` must have one maximum and one minimum between x = 0 and x = 1.
   - `q` must have a range between y = -2 and y = 5 over the interval [0, 1].
3. Create a function `p.q` which is the composition p(q(x)).
4. Create the function `D` (see the notes) that approximates the derivative. Use h = 0.0001.
5. Use `D` to create the functions:
   1. `Dp`
   2. `Dq`
   3. `Dp.q`
6. Produce a graph with the following properties:
   - Must show `p` between x = -2 and x = 5. [updated]
   - Must show `Dp` between x = -2 and x = 5. [updated]
   - The ylim() should be enough to show both `p` and `Dp` clearly.
   - Add dotted vertical lines at the zeros of `Dp`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).
7. Produce a second graph with the following properties:
   - Must show `q` between x = 0 and x = 1.
   - Must show `Dq` between x = 0 and x = 1.
   - The ylim() should be enough to show both `q` and `Dq` clearly.
   - Add dotted vertical lines at the zeros of `Dq`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).
8. Produce a third graph with the following properties:
   - Must show `p.q` between x = 0 and x = 1.
   - Must show `Dp.q` between x = 0 and x = 1.
   - Add a curve for `Dq(x)*Dp(q(x)) + offset`, where `offset` is chosen to make it clear that the graph of `Dp.q` and your curve are the same shape.
   - The ylim() should be enough to show all details clearly.
   - Add dotted vertical lines at the zeros of `Dp.q`.
   - The x-axis and y-axis should be appropriately labelled, and the graph should have a title (hint: `main=""`).

Neural Network I (bounded activation functions) (5 pts)

A function is bounded if there is a lower limit and an upper limit to the possible values it can output. For example, the logistic function is bounded because it can only return values in the interval (0, 1). In one or two paragraphs, discuss why a neural net that contains a hidden layer whose every node uses a bounded activation function must have both an upper and a lower bound on the possible values of its output nodes. Explain why this makes it impossible for a neural network that uses the logistic function as the activation function for its hidden nodes to produce a perfect approximation to the function f(x) = x.

Neural Network II (ReLU) (5 pts)

Consider a neural network that uses ReLU for all of its hidden nodes. Explain why the function encoded by such a network is necessarily linear for sufficiently large input values. Explain the implications for emulating f(x) = x.
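A minimal sketch of the chain-rule exercise follows. The specific polynomials are my own choice (any `p` and `q` satisfying the constraints will do), and `D` is written as a forward difference with h = 0.0001; the course notes may define `D` slightly differently (e.g. as a central difference).

```r
# p'(x) = (x+1)(x-1)(x-2)(x-4) has four roots in (-2, 5), giving p
# local maxima at x = -1 and x = 2 and local minima at x = 1 and x = 4.
# Built from arithmetic operators only, so it is vectorized automatically.
p <- function(x) x^5/5 - 1.5*x^4 + (7/3)*x^3 + 3*x^2 - 8*x

# q'(x) = 24(x - 0.25)(x - 0.75): a maximum at x = 0.25 and a minimum
# at x = 0.75, with all values on [0, 1] staying inside (-2, 5).
q <- function(x) 8*x^3 - 12*x^2 + 4.5*x + 1

p.q <- function(x) p(q(x))

# Forward-difference approximation to the derivative, h = 0.0001.
D <- function(f, h = 1e-4) function(x) (f(x + h) - f(x)) / h

Dp   <- D(p)
Dq   <- D(q)
Dp.q <- D(p.q)

# Empirical chain rule: Dp.q(x) should closely match Dq(x) * Dp(q(x)).
x <- 0.5
c(Dp.q(x), Dq(x) * Dp(q(x)))

# Graph 1 sketch: p and Dp on [-2, 5], dotted verticals at the zeros of Dp.
curve(p, -2, 5, ylim = c(-40, 80), xlab = "x", ylab = "y",
      main = "p and Dp on [-2, 5]")
curve(Dp, add = TRUE, col = "blue")
abline(v = c(-1, 1, 2, 4), lty = 3)
legend("topleft", legend = c("p", "Dp"), col = c("black", "blue"), lty = 1)
```

Graphs 2 and 3 follow the same pattern over [0, 1], swapping in `q`/`Dq` and `p.q`/`Dp.q`.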
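Both discussion questions can be checked numerically before writing them up. The tiny network below (one input, three hidden nodes, one output) uses weights I made up purely for illustration; with logistic hidden nodes its output is trapped in a fixed interval, while with ReLU hidden nodes it becomes exactly linear for large inputs.

```r
sigmoid <- function(x) 1 / (1 + exp(-x))
relu    <- function(x) pmax(x, 0)

# Hypothetical weights: 1 input -> 3 hidden nodes -> 1 output.
w1 <- c(1.2, -0.7, 0.5); b1 <- c(0.1, 0.3, -0.2)  # hidden layer
w2 <- c(2, -1, 0.5);     b2 <- 0.4                # output layer

net <- function(x, act) sum(w2 * act(w1 * x + b1)) + b2

# Logistic hidden layer: each hidden output lies in (0, 1), so the network
# output is confined to (b2 - 1, b2 + 2.5) = (-0.6, 2.9) for every x --
# it can never track the unbounded function f(x) = x.
net(1e6, sigmoid)    # saturates near the upper bound 2.9
net(-1e6, sigmoid)   # saturates near the lower bound -0.6

# ReLU hidden layer: for large enough x each unit is either fully on
# (affine) or fully off, so the network is exactly linear from there on.
net(1001, relu) - net(1000, relu)  # slope over one unit step
net(101, relu) - net(100, relu)    # same slope: 2*1.2 + 0.5*0.5 = 2.65
```

This matches the claims in the questions: bounded activations in every hidden node force bounded outputs, and a ReLU network can only emulate f(x) = x exactly if its active-unit slope works out to exactly 1 on the relevant region.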
