
The Physics and Mathematics Guild


Tags: physics, mathematics, science, universe 

Integrals.


Suicidesoldier#1

Fanatical Zealot

PostPosted: Sun Apr 17, 2011 9:06 pm
Okay. I admit. I suck at them (Ha, I know right?!).

The problem is, I need to learn how to do them. I've got the basic power rule, X^(N+1)/(N+1), so the integral of X^2 would be X^3/3 and the derivative would be 2X.

I also know that integrals are the antiderivative, and that they essentially bump something "up an X", or up a variable. Addition is the base, then multiplication is one step ahead, then exponents, and finally up variables (at least in my mind).

While the algorithms for addition and multiplication are relatively simple, because integrals are up a variable, it always varies, and thus requires a somewhat more complicated algorithm that includes a changing set of parameters (not that strange, actually, as all of them essentially require different variables, and squaring something is essentially taking itself, or X, and multiplying it by itself).
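(For what it's worth, here's a minimal sketch of checking that power rule by machine, using Python's sympy library, assuming it's installed:)

```python
# Minimal sketch: checking the power rule with sympy (assumes sympy is installed).
import sympy as sp

x, n = sp.symbols('x n')

print(sp.integrate(x**2, x))   # x**3/3   (sympy omits the "+ C")
print(sp.diff(x**2, x))        # 2*x

# The general power rule: integral of x^n is x^(n+1)/(n+1), except when n = -1,
# so sympy may return a Piecewise result covering that special case.
print(sp.integrate(x**n, x))
```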




Kay, got all the theories down, but I can't actually do them. It's a bunch of random equations; why? I understand that calculators can do them, so isn't there a base algorithm for all this, even if it IS super complicated? A sophomore's dream, perhaps?

Or is there simply a set of equations for them, and if so, is there an easy way to remember all of it? I know that there are lists, but a lot of the time I can't even understand what they're trying to say.

And, over your years, have you begun to understand them or found some fancy tricks?

I don't know, it's all very complicated to me, and I really want to understand them. xp  
PostPosted: Mon Apr 18, 2011 6:42 pm
I'm not sure what you're asking. Integration software works primarily through two types of steps: pattern matching with a table and some variation of the Risch algorithm. The latter has some simple and elegant results, but its general implementation is so complicated that if you're evaluating things by hand and are willing and capable enough to learn and use it, then your efforts would be much better served by practicing with the ordinary methods used in calculus textbooks.

The effort required to actually use something like the Risch algorithm with pen and paper is larger than just learning the standard methods; its only advantage is that it's a genuine algorithm. Plus, you just won't learn much if you spend time on it at the expense of actual calculus.

The most powerful pattern matching is integration by substitution. Basically, this is pattern matching for a function of a variable times that variable's derivative: Int[ F(f(x)) f'(x) dx ] = Int[ F(f) df ]. It works forwards or backwards (' denotes derivative). Examples:
1) Int[ xe^{x²} dx ] = (1/2) Int[ e^{x²} [x²]' dx ] = (1/2) e^{x²} + C
2) Int[ √(1-x²) dx ]: substitute x = sin u, so dx = cos u du and √(1-x²) = cos u, giving Int[ cos²u du ], which can be done easily by the power-reduction identity.
(The traditional substitution variable is u, though any name works.)
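(If you want to sanity-check those two examples by machine, here's a small sketch with Python's sympy library, assuming it's installed:)

```python
# Sketch: verifying the two substitution examples with sympy (assumes sympy is installed).
import sympy as sp

x = sp.symbols('x')

# Example 1: integral of x*e^(x^2) dx  ->  exp(x**2)/2  (constant of integration omitted)
print(sp.integrate(x * sp.exp(x**2), x))

# Example 2: integral of sqrt(1 - x^2) dx; sympy handles the trig substitution internally.
print(sp.integrate(sp.sqrt(1 - x**2), x))   # x*sqrt(1 - x**2)/2 + asin(x)/2
```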

Suicidesoldier#1
While the algorithms for addition and multiplication are relatively simple, because integrals are up a variable, it always varies, and thus requires a somewhat more complicated algorithm that includes a changing set of parameters (not that strange, actually, as all of them essentially require different variables, and squaring something is essentially taking itself, or X, and multiplying it by itself).

I'm not sure what you're referring to. There is no general formula for the integral of a product of two functions. On the other hand, there is a nice relationship for the product of a function and a derivative of a function: integration by parts. Basically, since derivatives obey the product rule
(fg)' = f'g + fg',
integrating both sides and rearranging terms gives
Int[ fg' ] = fg - Int[ f'g ]
A kind of trivial example might make use of x' = 1:
3) Int[ ln(x) dx ] = Int[ x'ln(x) dx ] = xln(x) - Int[ x*(1/x) dx ] = xln(x) - x + C.
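(Same idea, checked with sympy as a quick sketch:)

```python
# Sketch: the integration-by-parts example checked with sympy (assumes sympy is installed).
import sympy as sp

x = sp.symbols('x')

# integral of ln(x) dx = x*ln(x) - x  (plus a constant)
print(sp.integrate(sp.log(x), x))   # x*log(x) - x
```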

This is all going to be in every calculus book, so I won't bother going too in-depth. Though if something in the book you're reading confuses you, feel free to ask.

Suicidesoldier#1
And, over your years, have you begun to understand them or found some fancy tricks?

There are some fancy tricks applicable in specific cases, fancy in the sense of fancier than what you learn in calculus. For example, there are sometimes ways of turning a real integral into a contour integral in the complex plane, and contour integration in the complex plane is often simpler. But such things are not going to help you much if you don't know the techniques of basic calculus. You must understand them to progress.

Integrals, when they exist, are typically not conceptually difficult. However, they can involve rather complicated algebraic gymnastics, and thus be practically very tiresome.

My advice to you is to practice. Go through the problems in your book, see which ones look difficult, and try them. Feel free to ask if you're stuck.
It's also a good idea to go through the table of integrals found in most textbooks and actually try to prove the entries. Even if you intend to memorize them, that task will be much easier if you understand why they are true. And if you don't intend to memorize them, it'll still be good practice.

VorpalNeko
Captain


Suicidesoldier#1

Fanatical Zealot

PostPosted: Sat May 07, 2011 8:25 am
Herm... very interesting.

Well, my understanding is that, unlike most algorithms, because integrals are a step above linear and even exponential equations, the algorithm for them is "mutating", or dependent on the environment.


While it's always dependent on the variable, such as X^2 being dependent on X, the algorithms for finding an integral change with the variable.

Which is why there is no singular equation.



A step above integrals would be changing around math itself, or possibly something affecting real life (lol).  
PostPosted: Sun May 15, 2011 5:36 pm
Suicidesoldier#1
Well, my understanding is that, unlike most algorithms, because integrals are a step above linear and even exponential equations, the algorithm for them is "mutating", or dependent on the environment.

An integral actually is a linear equation, because both the differential and integral operators are linear:
Int[ (αf(x) + βg(x)) dx ] = α·Int[ f(x) dx ] + β·Int[ g(x) dx ], for any constants {α, β} and functions {f, g}.
In terms of differential equations, solving an integral is equivalent to solving the simplest subtype of what are called linear differential equations.

The difficulty comes not from non-linearity, but rather from the fact that it's a nontrivial linear operator, rather than something as simple as multiplication by a constant.
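(A quick machine check of that linearity property, as a sketch with sympy; the particular functions and constants are arbitrary:)

```python
# Sketch: checking linearity of integration with sympy (assumes sympy is installed).
import sympy as sp

x = sp.symbols('x')
f, g = sp.sin(x), x**2           # arbitrary example functions
alpha, beta = 3, 5               # arbitrary example constants

lhs = sp.integrate(alpha * f + beta * g, x)
rhs = alpha * sp.integrate(f, x) + beta * sp.integrate(g, x)

print(sp.simplify(lhs - rhs))    # 0, i.e. the two sides agree (up to a constant)
```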

Suicidesoldier#1
While it's always dependent on the variable, such as X^2 being dependent on X, the algorithms for finding an integral change with the variable.

It's not so much the change that's the problem, but the trouble of figuring out which change is advantageous. The name of the variable does not matter. For example, look at the previous case:
a) Int[ xe^{x²} dx ] = (1/2) Int[ e^{x²} [x²]' dx ]
In terms of the differential operator, we can write d(x²) = 2x dx, so this can equally well be written as:
b) Int[ xe^{x²} dx ] = (1/2) Int[ e^{x²} d(x²) ]
What's going on is that now we can treat x² as a single variable, since we're integrating with respect to x² rather than x. So the old rule that the exponential function is its own (anti)derivative applies: the answer is (1/2)e^{x²} + C.
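(Written out as an explicit u-substitution, the same computation looks like this:)

```latex
% The same computation as an explicit u-substitution, with u = x^2, du = 2x\,dx.
\int x e^{x^2}\,dx
  = \frac{1}{2}\int e^{x^2}\,(2x\,dx)
  = \frac{1}{2}\int e^{u}\,du
  = \frac{1}{2}e^{u} + C
  = \frac{1}{2}e^{x^2} + C
```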

Suicidesoldier#1
Which is why there is no singular equation.

Yeah, there's no single equation, or even a multi-part one. An interesting (and not directly related) example of something that's easy to compute but hard to invert: although polynomials are simple to evaluate, polynomial equations of degree higher than four (provably) have no solution formula built only from arithmetic operations and roots, unlike, say, the quadratic formula, which solves any second-degree polynomial.

(Of course there are really tons of examples. Multiplication is easy; factoring is hard. Etc. But the quintic polynomial is particularly famous as far as formula-searching goes.)
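(As a small illustration with sympy, assuming it's installed: it hands back the quadratic formula symbolically, while for a typical quintic it can only represent the roots implicitly.)

```python
# Sketch: the quadratic formula exists, but no radical formula exists for general quintics.
import sympy as sp

x, a, b, c = sp.symbols('x a b c')

# Degree two: sympy returns the quadratic formula (output format may vary by version),
# e.g. [(-b + sqrt(-4*a*c + b**2))/(2*a), -(b + sqrt(-4*a*c + b**2))/(2*a)]
print(sp.solve(a*x**2 + b*x + c, x))

# Degree five: this quintic has no roots expressible in radicals,
# so sympy returns implicit CRootOf objects instead of a formula.
print(sp.solve(x**5 - x + 1, x))
```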

VorpalNeko
Captain
