Use Taylor's series method to find a solution of $\frac{dy}{dx} = 1 + y^2,\ y(0) = 0$ at $x = 0.1$, taking $h = 0.1$, correct to three decimal places.

Solution:

Given $\frac{dy}{dx} = 1 + y^2$ with $x_0 = 0,\ y_0 = 0,\ h = 0.1$, compute the successive derivatives at $x_0$:

$\begin{aligned} y' &= 1 + y^2 & y_0' &= 1 \\ y'' &= 2yy' & y_0'' &= 0 \\ y''' &= 2yy'' + 2(y')^2 & y_0''' &= 2 \end{aligned}$

Taylor's series is given by:

$\begin{aligned} y(0.1) &= y_0 + h\, y_0' + \frac{h^2}{2!}\, y_0'' + \frac{h^3}{3!}\, y_0''' + \cdots \\ &= 0 + 0.1(1) + \frac{(0.1)^2}{2}(0) + \frac{(0.1)^3}{6}(2) \end{aligned}$

$y(0.1) = 0.10033 \approx 0.100$ (correct to three decimal places).
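The hand computation above can be sketched numerically; this is an illustrative Python snippet (the variable names are my own), evaluating the same third-order Taylor step:

```python
# Third-order Taylor step for y' = 1 + y^2, y(0) = 0, h = 0.1.
# Derivative values at x0 = 0 follow the hand computation above.
from math import factorial

y0, h = 0.0, 0.1

y1 = 1 + y0**2                 # y'   = 1 + y^2            -> 1
y2 = 2 * y0 * y1               # y''  = 2*y*y'             -> 0
y3 = 2 * y0 * y2 + 2 * y1**2   # y''' = 2*y*y'' + 2*(y')^2 -> 2

derivs = [y1, y2, y3]
y_h = y0 + sum(d * h**(k + 1) / factorial(k + 1)
               for k, d in enumerate(derivs))
print(round(y_h, 5))  # 0.10033
```

As a check, the exact solution of this initial-value problem is $y = \tan x$, and $\tan(0.1) \approx 0.100335$, so the truncated series agrees to three decimal places.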
