Why would one write a vector field as a derivative?
I thought a vector field was a function $\vec{A}:\mathbb{R}^n\to\mathbb{R}^n$, which takes a vector to a vector. This seemed intuitive to me, but in a mathematical physics course I found the definition
$$X=\sum_{i=1}^n X^i \frac{\partial}{\partial x^i},$$
where $X$ seems to take functions from $C^\infty$ as arguments and $X^i:\mathbb{R}^n\to\mathbb{R}$. I don't understand this new way of talking about vector fields, especially since our professor said we should see the $\frac{\partial}{\partial x^i}$ as basis vectors of the tangent space.
I also can't really translate any vector field I know into this new definition. So it would be great if someone could give an example of an $\vec{E}$-field or something similar and "translate" it into this new way of expressing vector fields. I'm also struggling to visualize the new definition. It seems that $X(f):\mathbb{R}^n\to\mathbb{R}$ (for $f\in C^\infty$), to which I really can't assign any physical meaning.
TL;DR: Can someone explain to me why someone would want to express a vector field in this way? Ideally I'd like to see an example of the same vector field expressed in the two forms, and an explanation of how one would interpret the visualization of the new vector field.
differential-geometry vector-fields
Would Mathematics be a better home for this question?
– Qmechanic♦ Aug 13 at 17:39

Quite similar question here on Math.SE. Note that this representation of a vector field is not "as a derivative", but "as a differential operator".
– Henning Makholm Aug 13 at 23:09

related/possible duplicate: Inconsistency with partial derivatives as basis vectors?
– AccidentalFourierTransform Aug 13 at 23:17
asked Aug 13 at 17:37 by Sito, edited Aug 13 at 18:00 by knzhou
3 Answers
30 votes, accepted
The motivation goes like this.
- When we define things mathematically, we want to use as few separate objects as possible. We don't want to define a new object independently if it can be defined in terms of existing things.
- Suppose a particle moves so that when it is at position $\mathbf{r}$, its velocity is $\mathbf{v}(\mathbf{r})$, where $\mathbf{v}$ is a vector field. Then for any function $f(\mathbf{r})$, the particle sees
$$\frac{df}{dt} = v^i \frac{\partial f}{\partial x^i}$$
by the chain rule. That is, if we interpret a vector field as a velocity field and give it a function $f$, then we can compute another function $df/dt$, which is the rate of change of $f$ seen by a particle following the flow of the vector field, as if it were a velocity field.
- By glancing at the chain rule, you see that if you know $df/dt$ for every $f$, then you know what the vector field is.
- Hence, when we work in the more general setting of a manifold, where it's not immediately clear how to define a vector field in the usual way ("an arrow at every point"), we can use this in reverse to define what a vector field is. That is, a vector field $v$ is a map of functions $f \mapsto v(f)$ which obeys certain properties.
- Note that not every vector field should be physically regarded as a velocity field. We're just making mathematical definitions here. The definitions are chosen to make the formalism as clean and simple as possible, possibly at the expense of intuition.
- Translating between the two is very simple. For example, the field
$$\mathbf{E}(\mathbf{r}) = x\,\hat{i} + xy\,\hat{j}$$
translates to
$$\mathbf{E}(\mathbf{r}) = x\,\frac{\partial}{\partial x} + xy\,\frac{\partial}{\partial y}.$$
That is, whenever you see $\partial/\partial x$ you can just imagine it as a unit vector pointing in the $x$ direction. The intuition is the same, because the two definitions obey the same properties.
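This translation is easy to sanity-check symbolically. The sketch below (my own illustration, using sympy; the test function $f$ is an arbitrary choice, not from the answer) verifies that the operator form of $\mathbf{E}$ applied to $f$ agrees with the classical dot product $\mathbf{E}\cdot\nabla f$:

```python
# Check: the operator X = x d/dx + x*y d/dy applied to a test function f
# agrees with the dot product of the component field (x, x*y) with grad f.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + sp.sin(y)          # arbitrary smooth test function

# Operator picture: X(f) = x * df/dx + x*y * df/dy
Xf_operator = x * sp.diff(f, x) + x*y * sp.diff(f, y)

# Classical picture: E . grad(f) with E = (x, x*y)
E = (x, x*y)
Xf_classical = E[0] * sp.diff(f, x) + E[1] * sp.diff(f, y)

assert sp.simplify(Xf_operator - Xf_classical) == 0
```

The two pictures coincide for every smooth $f$, which is exactly why the derivative notation carries no new information, just a new point of view.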
There are many examples of this in mathematics. For example, you might think $\log(x)$ is defined as "the number of times you have to multiply by $e$ to get to $x$", but it's unclear how to rigorously define that. Then, through semi-rigorous manipulation you can show that
$$\log(x) = \int_1^x \frac{dx'}{x'}.$$
Now the mathematician sees this and chooses to define $\log(x)$ as this integral. This is simpler, because it automatically works for any real $x$ and uses only the notion of an integral, which we already know. Then one can derive the "intuitive" properties of the logarithm, such as $\log(xy) = \log(x) + \log(y)$. By using a less intuitive definition, the formalism becomes simpler. And once you show that this definition is equivalent to the intuitive one, you are "allowed" to just keep on using the same intuition you started with, so you get the best of both worlds!
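The integral definition is also easy to test numerically. Here is a small sketch (my own addition; the midpoint rule and tolerances are arbitrary choices) checking the integral against `math.log` and checking the product law that follows from the definition:

```python
# Check log(x) = integral from 1 to x of dx'/x', and log(ab) = log a + log b.
import math

def log_via_integral(x):
    # Midpoint-rule quadrature of 1/x' on [1, x]; enough for ~9 digits here.
    n = 100_000
    h = (x - 1.0) / n
    return sum(h / (1.0 + (k + 0.5) * h) for k in range(n))

assert abs(log_via_integral(5.0) - math.log(5.0)) < 1e-6
assert abs(log_via_integral(6.0)
           - (log_via_integral(2.0) + log_via_integral(3.0))) < 1e-6
```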
Edit: the OP asks for examples where this definition of a vector field is more practically useful. I can think of two off the top of my head. First, how do basis vectors transform when you change coordinates from $x^i$ to $y^j$? In the usual formalism you may have to memorize a formula, but with the derivatives it follows from the chain rule,
$$\frac{\partial}{\partial x^i} = \frac{\partial y^j}{\partial x^i} \frac{\partial}{\partial y^j} \equiv J^j{}_i \frac{\partial}{\partial y^j},$$
where $J$ is the Jacobian matrix. Next, suppose the vector field actually is a velocity field, and you want to calculate $f(x(t))$ given $x(0)$, i.e. you want to know where you'll be if you follow the flow for time $t$. In this formalism, that's a one-liner:
$$f(x(t)) = \left(e^{tv} f\right)(x)\Big|_{x = x(0)}.$$
To prove this, expand the exponential in a Taylor series.
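The exponential formula can be seen in action symbolically. In this sketch (my own illustration; the one-dimensional field $v = x\,\partial/\partial x$ and the test function are arbitrary choices) the flow of $v$ is $x(t) = x_0 e^t$, and the Taylor series of $e^{tv}f$ reproduces $f(x(t))$ order by order:

```python
# For v = x d/dx on the line the flow is x(t) = x * exp(t).  Sum the
# Taylor series of exp(t*v) f and compare with f evaluated on the flow.
import sympy as sp

x, t = sp.symbols('x t')
f = x**2                          # test function

def apply_v(g):
    """One application of the operator v = x d/dx."""
    return x * sp.diff(g, x)

# Partial sum of exp(t*v) f = sum_n t^n/n! * v^n f
series, term = sp.Integer(0), f
for n in range(25):
    series += t**n / sp.factorial(n) * term
    term = apply_v(term)

lhs = f.subs(x, x * sp.exp(t))    # f(x(t)) with x(0) = x, equals x^2 exp(2t)
diff = sp.simplify(sp.series(lhs, t, 0, 25).removeO() - sp.expand(series))
assert diff == 0
```

Every power of $t$ matches: $(x\,d/dx)^n x^2 = 2^n x^2$, so the operator exponential resums to $x^2 e^{2t} = (x e^t)^2$, exactly $f$ carried along the flow.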
Of course, the great thing is that once you prove two formalisms are equivalent, you can use the intuition from either one interchangeably, because you know they're both equally valid. So you gain intuition for a few new cases, without losing any intuition you had before. You can always change back and forth, much like the same software program can run on different hardware.
Thank you very much for the answer! I think I get why it is useful to have such a definition (at least from a mathematical point of view), but I still struggle to see the physical meaning of which Peter Dalakov spoke in his answer ("directional derivative"). Is there a scenario where it is more useful to use the definition with partial derivatives than the "standard" one?
– Sito Aug 14 at 10:40
@Sito Sure, I edited with two examples.
– knzhou Aug 14 at 11:18
4 votes
A one-line motivation is as follows:
You can identify a vector (field) with the "directional derivative" along that vector (field).
Given a point and a vector at that point, you can (try to) differentiate a function at that point in that direction.
In coordinates, the relation between your $X$ and your $\vec{A}=\sum_{i=1}^n A_i \vec{e}_i$ is
$$ X = \vec{A} \cdot \nabla = \sum_{i=1}^n A_i \frac{\partial}{\partial x_i}.$$
A vector (field) is a "dynamical" object, a way of "transforming space": take the ODE associated with $\vec{A}$, i.e. $\boldsymbol{r}' = \vec{A}(\boldsymbol{r})$, and look at the flow. Once you have the flow, you can differentiate your functions along it.
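The "differentiate along the flow" picture can be checked numerically. Below is a sketch (my own illustration; the rotation field $A(r)=(-y,x)$ and the test function are arbitrary choices): take one tiny step along the flow $\boldsymbol{r}' = \vec{A}(\boldsymbol{r})$ and compare the finite difference of $f$ with the directional derivative $\vec{A}\cdot\nabla f$:

```python
# Differentiate f along the flow of A(r) = (-y, x) and compare with
# the directional derivative A . grad(f) at the starting point.

def A(r):
    x, y = r
    return (-y, x)                # example vector field (rotation)

def f(r):
    x, y = r
    return x**2 + 3.0 * y         # example test function

def grad_f(r):
    x, y = r
    return (2.0 * x, 3.0)

r0 = (1.0, 2.0)
h = 1e-6

# One tiny Euler step along the flow r' = A(r)
a = A(r0)
r1 = (r0[0] + h * a[0], r0[1] + h * a[1])
flow_derivative = (f(r1) - f(r0)) / h

directional = sum(ai * gi for ai, gi in zip(A(r0), grad_f(r0)))
assert abs(flow_derivative - directional) < 1e-4
```

As $h \to 0$ the two quantities agree exactly, which is the identification of the vector field with its directional derivative.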
1 vote
This is really a mathematics question rather than a physics question.
It's a question of how we think about the derivative. There is more than one way, and this is often how progress is made: by making progress on more than one front.
Newton had a dynamic conception of the rate of change, which he mathematically implemented through an intuitive notion of infinitesimals, a.k.a. his method of fluxions. By this method the notion of a derivation is a derived concept.
This is a natural concept to come up with. After all, once we have the notion of a derivative to hand, a natural question to ask is whether there is a simple relationship between the derivative of a product and the derivatives of its factors. There is, namely:
$d(gf) = dg \cdot f + g \cdot df$
When formalised, this is the notion of a derivation.
Mathematically speaking, we usually think of the derivative as defined by a limit, taking our cue from Cauchy.
However, one can take the derived concept of the derivation as another point of departure for the calculus, and this gives us an algebraic way of thinking about derivatives. It also gives a cleaner development of the main tools of the calculus, at least in the finite-dimensional context.
By the way, the usual definition of a vector field is as a section of a vector bundle. This simply means that to every point of the manifold a vector is attached in a continuous fashion.
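The algebraic point of view can be made concrete with dual numbers, a standard construction (my own addition, not part of this answer): bake the product rule $d(gf) = dg\cdot f + g\cdot df$ into multiplication, and derivatives come out of pure algebra, with no limit taken anywhere.

```python
# A minimal dual-number class: the Leibniz rule is built into __mul__,
# so evaluating a polynomial in Dual(x, 1) computes its derivative at x.
class Dual:
    def __init__(self, value, deriv):
        self.value = value        # f(x)
        self.deriv = deriv        # f'(x)

    def __add__(self, other):
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        # Leibniz rule: (fg)' = f'g + fg'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

# The "variable" x at the point x = 3 carries derivative 1.
x = Dual(3.0, 1.0)
p = x * x * x + x * x             # p(x) = x^3 + x^2

assert p.value == 36.0            # 27 + 9
assert p.deriv == 33.0            # p'(3) = 3*9 + 2*3
```

This is exactly a derivation acting on the algebra of polynomials, and it is also the idea behind forward-mode automatic differentiation.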
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
30
down vote
accepted
The motivation goes like this.
- When we define things mathematically, we want to use as few separate objects as possible. We don't want to define a new object independently if it can be defined in terms of existing things.
- Suppose a particle moves so that when it is at position $mathbfr$, its velocity is $mathbfv(mathbfr)$, where $mathbfv$ is a vector field. Then if there is some function $f(mathbfr)$, then the particle sees
$$fracdfdt = v^i fracpartial fpartial x^i$$
by the chain rule. That is, if we interpret a vector field as a velocity field, and give it a function $f$, then we can compute another function $df/dt$, which is the rate of change of $f$ seen by a particle following the flow of the vector field, as if it were a velocity field. - By glancing at the chain rule, you see that if you know $df/dt$ for every $f$, then you know what the vector field is.
- Hence, when we work in the more general setting of a manifold, where it's not immediately clear how to define a vector field in the usual way ("an arrow at every point"), we can use this in reverse to define what a vector field is. That is, a vector field $v$ is a map of functions $f mapsto v(f)$ which obeys certain properties.
- Note that not every vector field should be physically regarded as a velocity field. We're just making mathematical definitions here. The definitions are chosen to make the formalism as clean and simple as possible, possibly at the expense of intuition.
- Translating between the two is very simple. For example, the field
$$mathbfE(mathbfr) = x hati + xy hatj$$
translates to
$$mathbfE(mathbfr) = x fracpartialpartial x + x y fracpartialpartial y.$$
That is, whenever you see $partial/partial x$ you can just imagine it as a unit vector pointing in the $x$ direction. The intuition is the same, because the two definitions obey the same properties.
There are many examples of this in mathematics. For example, you might think $log(x)$ is defined as "the number of times you have to multiple by $e$ to get to $x$", but it's unclear how to rigorously define that. Then, through semi-rigorous manipulation you can show that
$$log(x) = int_1^x fracdx'x'.$$
Now the mathematician sees this and chooses to define $log(x)$ as this integral. This is simpler, because it automatically works for any real $x$ and uses only the notion of an integral, which we already know. Then one can derive the "intuitive" properties of the logarithm, such as $log(xy) = log(x) + log(y)$. By using a less intuitive definition, the formalism becomes simpler. And once you show that this definition is equivalent to the intuitive one, you are "allowed" to just keep on using the same intuition you started with, so you get the best of both worlds!
Edit: the OP asks for examples where this definition of a vector field is more practically useful. I can think of two off the top of my head. First, how do basis vectors transform when you change coordinates from $x^i$ to $y^j$? In the usual formalism you may have to memorize a formula, but with the derivatives it follows from the chain rule,
$$fracpartialpartial x^i = fracpartial y^jpartial x^i fracpartialpartial y^j equiv J^j_i fracpartialpartial y^j$$
where $J$ is the Jacobian matrix. Next, suppose the vector field actually is a velocity field, and you want to calculate $f(x(t))$ given $x(0)$, i.e. you want to know where you'll be if you follow the flow for time $t$. In this formalism, that's a one-liner. It's just
$$f(x(t)) = (e^t v f(x))|_x = x(0).$$
To prove this, expand the exponential in a Taylor series.
Of course, the great thing is that once you prove two formalisms are equivalent, you can use the intuition from either one interchangably, because you know they're both equally valid. So you gain intuition for a few new cases, without losing any intuition you had before. You can always change back and forth, much like the same software program can run on different hardware.
Thank you very much for the answer! I think I get why it is useful to have such a definition (at least from a mathematical point of view), but I still struggle to see the physical meaning of which Peter Dalakov spoke in his answer (âÂÂdirectional derivativeâÂÂ). Is there a scenario where it is more useful to use the definition with partial derivatives than the âÂÂstandardâ one?
â Sito
Aug 14 at 10:40
@Sito Sure, I edited with two examples.
â knzhou
Aug 14 at 11:18
add a comment |Â
up vote
30
down vote
accepted
The motivation goes like this.
- When we define things mathematically, we want to use as few separate objects as possible. We don't want to define a new object independently if it can be defined in terms of existing things.
- Suppose a particle moves so that when it is at position $mathbfr$, its velocity is $mathbfv(mathbfr)$, where $mathbfv$ is a vector field. Then if there is some function $f(mathbfr)$, then the particle sees
$$fracdfdt = v^i fracpartial fpartial x^i$$
by the chain rule. That is, if we interpret a vector field as a velocity field, and give it a function $f$, then we can compute another function $df/dt$, which is the rate of change of $f$ seen by a particle following the flow of the vector field, as if it were a velocity field. - By glancing at the chain rule, you see that if you know $df/dt$ for every $f$, then you know what the vector field is.
- Hence, when we work in the more general setting of a manifold, where it's not immediately clear how to define a vector field in the usual way ("an arrow at every point"), we can use this in reverse to define what a vector field is. That is, a vector field $v$ is a map of functions $f mapsto v(f)$ which obeys certain properties.
- Note that not every vector field should be physically regarded as a velocity field. We're just making mathematical definitions here. The definitions are chosen to make the formalism as clean and simple as possible, possibly at the expense of intuition.
- Translating between the two is very simple. For example, the field
$$mathbfE(mathbfr) = x hati + xy hatj$$
translates to
$$mathbfE(mathbfr) = x fracpartialpartial x + x y fracpartialpartial y.$$
That is, whenever you see $partial/partial x$ you can just imagine it as a unit vector pointing in the $x$ direction. The intuition is the same, because the two definitions obey the same properties.
There are many examples of this in mathematics. For example, you might think $log(x)$ is defined as "the number of times you have to multiple by $e$ to get to $x$", but it's unclear how to rigorously define that. Then, through semi-rigorous manipulation you can show that
$$log(x) = int_1^x fracdx'x'.$$
Now the mathematician sees this and chooses to define $log(x)$ as this integral. This is simpler, because it automatically works for any real $x$ and uses only the notion of an integral, which we already know. Then one can derive the "intuitive" properties of the logarithm, such as $log(xy) = log(x) + log(y)$. By using a less intuitive definition, the formalism becomes simpler. And once you show that this definition is equivalent to the intuitive one, you are "allowed" to just keep on using the same intuition you started with, so you get the best of both worlds!
Edit: the OP asks for examples where this definition of a vector field is more practically useful. I can think of two off the top of my head. First, how do basis vectors transform when you change coordinates from $x^i$ to $y^j$? In the usual formalism you may have to memorize a formula, but with the derivatives it follows from the chain rule,
$$fracpartialpartial x^i = fracpartial y^jpartial x^i fracpartialpartial y^j equiv J^j_i fracpartialpartial y^j$$
where $J$ is the Jacobian matrix. Next, suppose the vector field actually is a velocity field, and you want to calculate $f(x(t))$ given $x(0)$, i.e. you want to know where you'll be if you follow the flow for time $t$. In this formalism, that's a one-liner. It's just
$$f(x(t)) = (e^t v f(x))|_x = x(0).$$
To prove this, expand the exponential in a Taylor series.
Of course, the great thing is that once you prove two formalisms are equivalent, you can use the intuition from either one interchangably, because you know they're both equally valid. So you gain intuition for a few new cases, without losing any intuition you had before. You can always change back and forth, much like the same software program can run on different hardware.
Thank you very much for the answer! I think I get why it is useful to have such a definition (at least from a mathematical point of view), but I still struggle to see the physical meaning of which Peter Dalakov spoke in his answer (âÂÂdirectional derivativeâÂÂ). Is there a scenario where it is more useful to use the definition with partial derivatives than the âÂÂstandardâ one?
â Sito
Aug 14 at 10:40
@Sito Sure, I edited with two examples.
â knzhou
Aug 14 at 11:18
add a comment |Â
up vote
30
down vote
accepted
up vote
30
down vote
accepted
The motivation goes like this.
- When we define things mathematically, we want to use as few separate objects as possible. We don't want to define a new object independently if it can be defined in terms of existing things.
- Suppose a particle moves so that when it is at position $mathbfr$, its velocity is $mathbfv(mathbfr)$, where $mathbfv$ is a vector field. Then if there is some function $f(mathbfr)$, then the particle sees
$$fracdfdt = v^i fracpartial fpartial x^i$$
by the chain rule. That is, if we interpret a vector field as a velocity field, and give it a function $f$, then we can compute another function $df/dt$, which is the rate of change of $f$ seen by a particle following the flow of the vector field, as if it were a velocity field. - By glancing at the chain rule, you see that if you know $df/dt$ for every $f$, then you know what the vector field is.
- Hence, when we work in the more general setting of a manifold, where it's not immediately clear how to define a vector field in the usual way ("an arrow at every point"), we can use this in reverse to define what a vector field is. That is, a vector field $v$ is a map of functions $f mapsto v(f)$ which obeys certain properties.
- Note that not every vector field should be physically regarded as a velocity field. We're just making mathematical definitions here. The definitions are chosen to make the formalism as clean and simple as possible, possibly at the expense of intuition.
- Translating between the two is very simple. For example, the field
$$mathbfE(mathbfr) = x hati + xy hatj$$
translates to
$$mathbfE(mathbfr) = x fracpartialpartial x + x y fracpartialpartial y.$$
That is, whenever you see $partial/partial x$ you can just imagine it as a unit vector pointing in the $x$ direction. The intuition is the same, because the two definitions obey the same properties.
There are many examples of this in mathematics. For example, you might think $log(x)$ is defined as "the number of times you have to multiple by $e$ to get to $x$", but it's unclear how to rigorously define that. Then, through semi-rigorous manipulation you can show that
$$log(x) = int_1^x fracdx'x'.$$
Now the mathematician sees this and chooses to define $log(x)$ as this integral. This is simpler, because it automatically works for any real $x$ and uses only the notion of an integral, which we already know. Then one can derive the "intuitive" properties of the logarithm, such as $log(xy) = log(x) + log(y)$. By using a less intuitive definition, the formalism becomes simpler. And once you show that this definition is equivalent to the intuitive one, you are "allowed" to just keep on using the same intuition you started with, so you get the best of both worlds!
Edit: the OP asks for examples where this definition of a vector field is more practically useful. I can think of two off the top of my head. First, how do basis vectors transform when you change coordinates from $x^i$ to $y^j$? In the usual formalism you may have to memorize a formula, but with the derivatives it follows from the chain rule,
$$fracpartialpartial x^i = fracpartial y^jpartial x^i fracpartialpartial y^j equiv J^j_i fracpartialpartial y^j$$
where $J$ is the Jacobian matrix. Next, suppose the vector field actually is a velocity field, and you want to calculate $f(x(t))$ given $x(0)$, i.e. you want to know where you'll be if you follow the flow for time $t$. In this formalism, that's a one-liner. It's just
$$f(x(t)) = (e^t v f(x))|_x = x(0).$$
To prove this, expand the exponential in a Taylor series.
Of course, the great thing is that once you prove two formalisms are equivalent, you can use the intuition from either one interchangably, because you know they're both equally valid. So you gain intuition for a few new cases, without losing any intuition you had before. You can always change back and forth, much like the same software program can run on different hardware.
The motivation goes like this.
- When we define things mathematically, we want to use as few separate objects as possible. We don't want to define a new object independently if it can be defined in terms of existing things.
- Suppose a particle moves so that when it is at position $mathbfr$, its velocity is $mathbfv(mathbfr)$, where $mathbfv$ is a vector field. Then if there is some function $f(mathbfr)$, then the particle sees
$$fracdfdt = v^i fracpartial fpartial x^i$$
by the chain rule. That is, if we interpret a vector field as a velocity field, and give it a function $f$, then we can compute another function $df/dt$, which is the rate of change of $f$ seen by a particle following the flow of the vector field, as if it were a velocity field. - By glancing at the chain rule, you see that if you know $df/dt$ for every $f$, then you know what the vector field is.
- Hence, when we work in the more general setting of a manifold, where it's not immediately clear how to define a vector field in the usual way ("an arrow at every point"), we can use this in reverse to define what a vector field is. That is, a vector field $v$ is a map of functions $f mapsto v(f)$ which obeys certain properties.
- Note that not every vector field should be physically regarded as a velocity field. We're just making mathematical definitions here. The definitions are chosen to make the formalism as clean and simple as possible, possibly at the expense of intuition.
- Translating between the two is very simple. For example, the field
$$mathbfE(mathbfr) = x hati + xy hatj$$
translates to
$$mathbfE(mathbfr) = x fracpartialpartial x + x y fracpartialpartial y.$$
That is, whenever you see $partial/partial x$ you can just imagine it as a unit vector pointing in the $x$ direction. The intuition is the same, because the two definitions obey the same properties.
There are many examples of this in mathematics. For example, you might think $log(x)$ is defined as "the number of times you have to multiple by $e$ to get to $x$", but it's unclear how to rigorously define that. Then, through semi-rigorous manipulation you can show that
$$log(x) = int_1^x fracdx'x'.$$
Now the mathematician sees this and chooses to define $log(x)$ as this integral. This is simpler, because it automatically works for any real $x$ and uses only the notion of an integral, which we already know. Then one can derive the "intuitive" properties of the logarithm, such as $log(xy) = log(x) + log(y)$. By using a less intuitive definition, the formalism becomes simpler. And once you show that this definition is equivalent to the intuitive one, you are "allowed" to just keep on using the same intuition you started with, so you get the best of both worlds!
Edit: the OP asks for examples where this definition of a vector field is more practically useful. I can think of two off the top of my head. First, how do basis vectors transform when you change coordinates from $x^i$ to $y^j$? In the usual formalism you may have to memorize a formula, but with the derivatives it follows from the chain rule,
$$fracpartialpartial x^i = fracpartial y^jpartial x^i fracpartialpartial y^j equiv J^j_i fracpartialpartial y^j$$
where $J$ is the Jacobian matrix. Next, suppose the vector field actually is a velocity field, and you want to calculate $f(x(t))$ given $x(0)$, i.e. you want to know where you'll be if you follow the flow for time $t$. In this formalism, that's a one-liner. It's just
$$f(x(t)) = (e^t v f(x))|_x = x(0).$$
To prove this, expand the exponential in a Taylor series.
Of course, the great thing is that once you prove two formalisms are equivalent, you can use the intuition from either one interchangably, because you know they're both equally valid. So you gain intuition for a few new cases, without losing any intuition you had before. You can always change back and forth, much like the same software program can run on different hardware.
edited Aug 14 at 11:18
answered Aug 13 at 17:49
knzhou
33.9k897170
33.9k897170
Thank you very much for the answer! I think I get why it is useful to have such a definition (at least from a mathematical point of view), but I still struggle to see the physical meaning of which Peter Dalakov spoke in his answer (âÂÂdirectional derivativeâÂÂ). Is there a scenario where it is more useful to use the definition with partial derivatives than the âÂÂstandardâ one?
â Sito
Aug 14 at 10:40
@Sito Sure, I edited with two examples.
â knzhou
Aug 14 at 11:18
up vote
4
down vote
A one-line motivation is as follows:
You can identify a vector (field) with the "directional derivative" along that vector (field).
Given a point and a vector at that point, you can (try to) differentiate a function at that point in that direction.
In coordinates, the relation between your $X$ and your $\vec{A} = \sum_{i=1}^n A_i \vec{e}_i$ is
$$ X = \vec{A} \cdot \nabla = \sum_{i=1}^n A_i \frac{\partial}{\partial x_i}.$$
A vector (field) is a "dynamical" object, a way of "transforming space": take the ODE associated with $\vec{A}$, i.e. $\boldsymbol{r}' = \vec{A}(\boldsymbol{r})$, and look at the flow. Once you have the flow, you can differentiate your functions along it.
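The two viewpoints can be checked against each other numerically. In this sketch the rotation field $A(x,y) = (-y, x)$ and the test function $f(x,y) = x^2 + 3y$ are my own illustrative choices; the point is that $(A \cdot \nabla)f$ at a point equals the rate of change of $f$ along the flow through that point.

```python
import math

# Rotation field A(x, y) = (-y, x); its flow is rotation by angle t.
def A(x, y):
    return (-y, x)

def f(x, y):
    return x**2 + 3*y  # an arbitrary smooth test function

def Xf(x, y, h=1e-6):
    # Directional derivative (A . grad) f via central differences.
    ax, ay = A(x, y)
    dfdx = (f(x + h, y) - f(x - h, y)) / (2*h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2*h)
    return ax*dfdx + ay*dfdy

def flow(x, y, t):
    # Exact flow of A: rotate (x, y) by angle t.
    c, s = math.cos(t), math.sin(t)
    return (c*x - s*y, s*x + c*y)

x0, y0 = 1.0, 0.5
dt = 1e-6
# Rate of change of f along the flow line through (x0, y0), at t = 0.
rate = (f(*flow(x0, y0, dt)) - f(*flow(x0, y0, -dt))) / (2*dt)
print(abs(rate - Xf(x0, y0)) < 1e-4)  # True
```

The same number comes out both ways: the operator $X = A \cdot \nabla$ acting on $f$, and the ordinary time derivative of $f$ along a solution of $\boldsymbol{r}' = A(\boldsymbol{r})$.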
answered Aug 14 at 8:47
Peter Dalakov
1411
up vote
1
down vote
This is really a mathematics question and not a physics question.
It's a question of how we think about the derivative. There is more than one way, and this is often how progress is made: by making progress on more than one front.
Newton had a dynamic conception of the rate of change, which he implemented mathematically through an intuitive notion of infinitesimals, aka his method of fluxions. In this approach, the notion of a derivation is a derived concept.
It is a natural concept to come up with. After all, once we have the notion of a derivative to hand, a natural question to ask is whether there is a simple relationship between the derivative of a product and the derivatives of its factors. There is, namely the product rule:
$d(fg) = df \cdot g + f \cdot dg$
When formalised, this is the notion of a derivation.
Mathematically speaking, we usually think of the derivative as defined by a limit, taking our cue from Cauchy.
However, one can take the derived concept of a derivation as another point of departure for the calculus; this gives an algebraic way of thinking about derivatives, and a cleaner development of the main tools of the calculus, at least in the finite-dimensional context.
By the way, the usual definition of a vector field is as a section of a vector bundle, in this case the tangent bundle. This simply means that to every point of the manifold a vector is attached in a continuous fashion.
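The derivation property is easy to verify numerically for a concrete operator. In this sketch the operator $X = x\,\partial/\partial y - y\,\partial/\partial x$ (the rotation field again, viewed as an operator) and the polynomials $f$, $g$ are my own illustrative choices.

```python
# Check the product (Leibniz) rule X(fg) = X(f) g + f X(g) numerically
# for the derivation X = x d/dy - y d/dx.
def X(h, x, y, eps=1e-6):
    dhdx = (h(x + eps, y) - h(x - eps, y)) / (2*eps)
    dhdy = (h(x, y + eps) - h(x, y - eps)) / (2*eps)
    return x*dhdy - y*dhdx

f = lambda x, y: x**2 + y
g = lambda x, y: 3*x*y
fg = lambda x, y: f(x, y) * g(x, y)

x0, y0 = 0.4, -1.2
lhs = X(fg, x0, y0)
rhs = X(f, x0, y0) * g(x0, y0) + f(x0, y0) * X(g, x0, y0)
print(abs(lhs - rhs) < 1e-6)  # True: X acts as a derivation
```

Any first-order differential operator of this form satisfies the rule; conversely, the algebraic definition takes this rule as the axiom that characterises vector fields.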
edited Aug 14 at 10:03
answered Aug 14 at 9:57
Mozibur Ullah
4,43122144
7
Would Mathematics be a better home for this question?
– Qmechanic♦
Aug 13 at 17:39
Quite similar question here on Math.SE. Note that this representation of a vector field is not "as a derivative", but "as a differential operator".
– Henning Makholm
Aug 13 at 23:09
1
related/possible duplicate: Inconsistency with partial derivatives as basis vectors?.
– AccidentalFourierTransform
Aug 13 at 23:17