

In the document Annotated Algorithms in Python (Pages 155-162)


4.3 Standard strategies

4.3.6 Taylor series

A function f(x): R→R is said to be real analytic in x̄ if it is continuous in x = x̄ and all its derivatives exist and are continuous in x = x̄.

When this is the case, the function can be locally approximated with a local power series:

f(x) = f(x̄) + f^(1)(x̄)(x−x̄) + · · · + (f^(k)(x̄)/k!)(x−x̄)^k + R_k (4.17)

The remainder R_k can be proven to be (Taylor's theorem):

R_k = (f^(k+1)(ξ)/(k+1)!)(x−x̄)^(k+1) (4.18)

where ξ is a point between x and x̄. Therefore, if f^(k+1) exists and is bounded within a neighborhood D = {x : |x−x̄| < ε}, then

|R_k| < (max_{x∈D} |f^(k+1)(x)|/(k+1)!) |(x−x̄)^(k+1)| (4.19)

If we stop the Taylor expansion at a finite value of k, the preceding formula gives us the systematic error part of the computational error.
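As a quick numerical illustration of this bound (not part of the original listings; `exp_partial_sum` is a throwaway helper introduced here, and we take f(x) = e^x so the (k+1)-th derivative on D = [0, x] is maximized at e^x):

```python
import math

def exp_partial_sum(x, k):
    # partial Taylor sum of e**x around 0: sum of x**n/n! for n = 0..k
    return sum(x**n / math.factorial(n) for n in range(k + 1))

x = 0.5
for k in range(1, 10):
    actual = abs(math.exp(x) - exp_partial_sum(x, k))
    # bound (4.19): max of the (k+1)-th derivative e**x on [0,x], times |x|**(k+1)/(k+1)!
    bound = math.exp(x) * abs(x)**(k + 1) / math.factorial(k + 1)
    assert actual <= bound  # the truncation error respects the remainder bound
```

The truncation error shrinks factorially with k, which is why a handful of terms suffice for small x.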

Some Taylor series are very easy to compute:

Exponential for x̄ = 0:

f(x) = e^x (4.20)

f^(1)(x) = e^x (4.21)

. . . = . . . (4.22)

f^(k)(x) = e^x (4.23)

e^x = 1 + x + (1/2)x^2 + · · · + (1/k!)x^k + . . . (4.24)

Sin for x̄ = 0:

f(x) = sin(x) (4.25)

f^(1)(x) = cos(x) (4.26)

f^(2)(x) = −sin(x) (4.27)

f^(3)(x) = −cos(x) (4.28)

. . . = . . . (4.29)

sin(x) = x − (1/3!)x^3 + · · · + ((−1)^k/(2k+1)!)x^(2k+1) + . . . (4.30)

We can show the effects of the various terms:

Listing 4.4: in file: nlib.py

>>> X = [0.03*i for i in xrange(200)]
>>> c = Canvas(title='sin(x) approximations')
>>> c.plot([(x,math.sin(x)) for x in X],legend='sin(x)')
<...>
>>> c.plot([(x,x) for x in X[:100]],legend='Taylor 1st')
<...>
>>> c.plot([(x,x-x**3/6) for x in X[:100]],legend='Taylor 3rd')
<...>
>>> c.plot([(x,x-x**3/6+x**5/120) for x in X[:100]],legend='Taylor 5th')
<...>
>>> c.save('images/sin.png')

Notice that we can expand the Taylor series around any other point, for example, x̄ = π/2, and we get

sin(x) = 1 − (1/2)(x−π/2)^2 + · · · + ((−1)^k/(2k)!)(x−π/2)^(2k) + . . . (4.31)

and a plot would show:

Listing 4.5: in file: nlib.py

>>> a = math.pi/2
>>> X = [0.03*i for i in xrange(200)]
>>> c = Canvas(title='sin(x) approximations')
>>> c.plot([(x,math.sin(x)) for x in X],legend='sin(x)')
<...>
>>> c.plot([(x,1-(x-a)**2/2) for x in X[:150]],legend='Taylor 2nd')
<...>
>>> c.plot([(x,1-(x-a)**2/2+(x-a)**4/24) for x in X[:150]],legend='Taylor 4th')
<...>
>>> c.plot([(x,1-(x-a)**2/2+(x-a)**4/24-(x-a)**6/720) for x in X[:150]],legend='Taylor 6th')
<...>
>>> c.save('images/sin2.png')

Figure 4.1: The figure shows the sin function and its approximation using the Taylor expansion around x = 0 at different orders.

Similarly, we can expand the cos function around x̄ = 0. Not accidentally, we would get the same coefficients as the Taylor expansion of the sin function around x̄ = π/2. In fact, sin(x) = cos(x − π/2):

Figure 4.2: The figure shows the sin function and its approximation using the Taylor expansion around x = π/2 at different orders.

f(x) = cos(x) (4.32)

f^(1)(x) = −sin(x) (4.33)

f^(2)(x) = −cos(x) (4.34)

f^(3)(x) = sin(x) (4.35)

. . . = . . . (4.36)

cos(x) = 1 − (1/2)x^2 + · · · + ((−1)^k/(2k)!)x^(2k) + . . . (4.37)

With a simple substitution, it is easy to prove that

e^(ix) = cos(x) + i sin(x) (4.38)

which will be useful when we talk about Fourier and Laplace transforms.
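This identity can be checked numerically with Python's cmath module (a quick sanity check added here, not part of nlib):

```python
import cmath
import math

x = 0.7
lhs = cmath.exp(1j * x)                   # e^(i x)
rhs = complex(math.cos(x), math.sin(x))   # cos(x) + i sin(x)
assert abs(lhs - rhs) < 1e-12             # Euler's formula holds to machine precision
```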

Now let's consider the kth term in the Taylor expansion of e^x. It can be rearranged as a function of the previous (k−1)th term:

T_k(x) = (1/k!)x^k = (x/k)(1/(k−1)!)x^(k−1) = (x/k)T_{k−1}(x) (4.39)

For x < 0, the terms in the sum have alternating signs and are decreasing in magnitude; therefore, for x < 0, |R_k| < |T_{k+1}(x)|. This allows for an easy implementation of the Taylor expansion and its stopping condition:
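A quick check that the recurrence (4.39) reproduces the direct terms x^k/k! (an illustrative snippet added here, not from the book):

```python
import math

x = -1.5
t = 1.0  # T_0(x) = 1
for k in range(1, 10):
    t = t * x / k                        # recurrence (4.39): T_k = (x/k) * T_(k-1)
    direct = x**k / math.factorial(k)    # direct evaluation of x**k / k!
    assert abs(t - direct) < 1e-12
```

Computing each term from the previous one avoids re-evaluating powers and factorials at every step.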

Listing 4.6: in file: nlib.py

def myexp(x,precision=1e-6,max_steps=40):
    if x==0:
        return 1.0
    elif x>0:
        return 1.0/myexp(-x,precision,max_steps)
    else:
        t = s = 1.0 # first term
        for k in xrange(1,max_steps):
            t = t*x/k # next term
            s = s + t # add next term
            if abs(t)<precision: return s
        raise ArithmeticError('no convergence')

This code presents all the features of many of the algorithms we see later in the chapter:

• It deals with the special case e^0 = 1.

• It reduces difficult problems to easier problems (the exponential of a positive number to the exponential of a negative number via e^x = 1/e^(−x)).

• It approximates the “true” solution by iterations.

• The max number of iterations is limited.

• There is a stopping condition.

• It detects failure to converge.

Here is a test of its convergence:

Listing 4.7: in file: nlib.py

>>> for i in xrange(10):
...     x = 0.1*i
...     assert abs(myexp(x) - math.exp(x)) < 1e-4

We can do the same for the sin function:

T_k(x) = (−x^2/((2k)(2k+1)))T_{k−1}(x) (4.40)

In this case, the residue is always bounded by

|R_k| < |x^(2k+1)| (4.41)

because the derivatives of sin are always ±sin and ±cos, and their image is always within [−1, 1].

Also notice that the stopping condition is only guaranteed to trigger when 0 ≤ x < 1. Therefore, for other values of x, we must use trigonometric relations to reduce the problem to a domain where the Taylor series converges.
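The trigonometric reductions used below can be verified against the math module (a sanity check of the identities, added here and not part of nlib):

```python
import math

# each identity maps the argument toward [0, pi/4], where the series converges quickly
assert abs(math.sin(7.0) - math.sin(7.0 % (2 * math.pi))) < 1e-12  # periodicity
assert abs(math.sin(4.0) + math.sin(2 * math.pi - 4.0)) < 1e-12    # x in (pi, 2*pi)
assert abs(math.sin(2.0) - math.sin(math.pi - 2.0)) < 1e-12        # x in (pi/2, pi)
x = 1.0  # x in (pi/4, pi/2): sin(x) = sqrt(1 - sin(pi/2 - x)**2)
assert abs(math.sin(x) - math.sqrt(1.0 - math.sin(math.pi / 2 - x)**2)) < 1e-12
```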

Hence we write:

Listing 4.8: in file: nlib.py

def mysin(x,precision=1e-6,max_steps=40):
    pi = math.pi
    if x==0:
        return 0
    elif x<0:
        return -mysin(-x,precision,max_steps)
    elif x>2.0*pi:
        return mysin(x % (2.0*pi),precision,max_steps)
    elif x>pi:
        return -mysin(2.0*pi - x,precision,max_steps)
    elif x>pi/2:
        return mysin(pi-x,precision,max_steps)
    elif x>pi/4:
        return math.sqrt(1.0-mysin(pi/2-x,precision,max_steps)**2)
    else:
        t = s = x # first term
        for k in xrange(1,max_steps):
            t = t*(-1.0)*x*x/(2*k)/(2*k+1) # next term
            s = s + t # add next term
            r = x**(2*k+1) # estimate residue
            if r<precision: return s # stopping condition
        raise ArithmeticError('no convergence')

Here we test it:

Listing 4.9: in file: nlib.py

>>> for i in xrange(10):
...     x = 0.1*i
...     assert abs(mysin(x) - math.sin(x)) < 1e-4

Finally, we can do the same for the cos function:

Listing 4.10: in file: nlib.py

def mycos(x,precision=1e-6,max_steps=40):
    pi = math.pi
    if x==0:
        return 1.0
    elif x<0:
        return mycos(-x,precision,max_steps)
    elif x>2.0*pi:
        return mycos(x % (2.0*pi),precision,max_steps)
    elif x>pi:
        return mycos(2.0*pi - x,precision,max_steps)
    elif x>pi/2:
        return -mycos(pi-x,precision,max_steps)
    elif x>pi/4:
        return math.sqrt(1.0-mycos(pi/2-x,precision,max_steps)**2)
    else:
        t = s = 1.0 # first term
        for k in xrange(1,max_steps):
            t = t*(-1.0)*x*x/(2*k)/(2*k-1) # next term
            s = s + t # add next term
            r = x**(2*k) # estimate residue
            if r<precision: return s # stopping condition
        raise ArithmeticError('no convergence')

Here is a test of convergence:

Listing 4.11: in file: nlib.py

>>> for i in xrange(10):
...     x = 0.1*i
...     assert abs(mycos(x) - math.cos(x)) < 1e-4
