If $A^\mu$ is not determined uniquely by Maxwell's equations, what happens if we solve for it numerically?
Given a solution $A^\mu(x)$ to Maxwell's equations
\begin{equation}
\Box A^\mu(x)-\partial^\mu\partial_\nu A^\nu(x)=0\tag{1}
\end{equation}
which also satisfies some specified initial conditions at time $t_0$,
\begin{equation}
A^\mu(\vec{x},t_0)=f^\mu(\vec{x}),\quad \dot{A}^\mu(\vec{x},t_0)=g^\mu(\vec{x}),\tag{2}
\end{equation}
the function
\begin{equation}
A'^\mu(x)=A^\mu(x)+\partial^\mu\alpha(x)\tag{3}
\end{equation}
also satisfies the equations of motion. If we arrange that the scalar function $\alpha$ also satisfies
\begin{equation}
\partial^\mu\alpha(\vec{x},t_0)=0,\quad \partial^\mu\dot{\alpha}(\vec{x},t_0)=0\tag{4}
\end{equation}
at the initial time $t_0$, then the new solution $A'^\mu$ also satisfies the initial conditions. For example, the function
\begin{equation}
\alpha(\vec{x},t)=(t-t_0)^5\,h(\vec{x})\,e^{-(t-t_0)^2}
\end{equation}
satisfies the conditions of Eq. $(4)$ and also vanishes as $t\rightarrow\pm\infty$. Therefore, the solution to Eq. $(1)$ is not uniquely determined by the initial data in Eq. $(2)$.
Question: If one simulates Eq. $(1)$ numerically on a computer, why is the field configuration at a later time not uniquely determined by the data in Eq. $(2)$?
electromagnetism gauge-theory maxwell-equations boundary-conditions determinism
asked Nov 16 at 23:54 by Luke; last edited by knzhou
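To make the gauge-function example concrete, here is a small sanity check (my own sketch, not part of the original question) that the $\alpha$ above satisfies the conditions of Eq. $(4)$; the restriction to one spatial dimension and the arbitrary profile $h(x)$ are my own simplifications, and the components are listed up to metric signs.

```python
# Minimal sympy check that alpha(x, t) = (t - t0)^5 h(x) exp(-(t - t0)^2)
# satisfies Eq. (4): all components of d^mu alpha and d^mu (d_t alpha)
# vanish at t = t0, for an arbitrary spatial profile h(x).
import sympy as sp

t, x, t0 = sp.symbols('t x t0')
h = sp.Function('h')(x)
alpha = (t - t0)**5 * h * sp.exp(-(t - t0)**2)

grad     = [sp.diff(alpha, t), sp.diff(alpha, x)]                  # d^mu alpha (up to signs)
grad_dot = [sp.diff(alpha, t, 2), sp.diff(sp.diff(alpha, t), x)]   # d^mu (d_t alpha)

print([sp.simplify(g.subs(t, t0)) for g in grad])      # [0, 0]
print([sp.simplify(g.subs(t, t0)) for g in grad_dot])  # [0, 0]
```

Both lists print as zero, so $A'^\mu=A^\mu+\partial^\mu\alpha$ reproduces the same initial data $(f^\mu, g^\mu)$ while differing from $A^\mu$ at later times.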
Try and simulate it yourself. Spoiler alert: you won't be able to, at least not without fixing the gauge first. Numerically solving a PDE requires, for example, inverting a matrix/solving a linear system. This doesn't work when you have gauge invariance, because the matrix is singular.
– AccidentalFourierTransform, Nov 17 at 0:04
@AccidentalFourierTransform This isn't quite true. Your numerics may or may not converge to a solution, depending on the algorithm. Some techniques involve solving a linear system, and they'll fail, but many techniques will, e.g., trivially converge to the solution $\alpha\equiv 0$. The issue is non-uniqueness, not non-existence.
– tparker, Nov 17 at 2:49
@tparker I never said anything about non-existence. A linear system with a singular matrix has an infinite number of solutions. So we agree the issue is about non-uniqueness, not about non-existence.
– AccidentalFourierTransform, Nov 17 at 2:53
Related: physics.stackexchange.com/q/20071/2451
– Qmechanic♦, Nov 17 at 3:37
"If one simulates Eq. $(1)$ numerically on a computer, why is the field configuration at a later time not uniquely determined by the data in Eq. $(2)$?" I don't think the assumption is true. Even though the solution is non-unique, your algorithm can converge to a particular solution. Take the ordinary equation $x^2=1$. If you apply the bisection method in the interval $[0,2]$, you find the solution $x=1$, although you miss $x=-1$. Other methods might not converge. So I think that without specifying a particular numerical method, answers are going to be very vague.
– jinawee, 2 days ago
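The singular-matrix point in the first comment can be seen directly in momentum space, where the operator in Eq. $(1)$ becomes the $4\times 4$ matrix $-k^2\,\delta^\mu{}_\nu + k^\mu k_\nu$. The numpy sketch below is my own illustration (not from the comments); the signature $(+,-,-,-)$ and the randomly chosen $k^\mu$ are assumptions of the sketch.

```python
# In Fourier space, Box A^mu - d^mu(d_nu A^nu) = 0 becomes M A = 0 with
# M^mu_nu = -(k.k) delta^mu_nu + k^mu k_nu.  M annihilates the pure-gauge
# direction A^nu ~ k^nu, so it is singular and cannot be inverted.
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])       # metric, signature (+,-,-,-)
rng = np.random.default_rng(0)
k_up = rng.normal(size=4)                    # a generic wave vector k^mu
k_dn = eta @ k_up                            # k_mu
k2 = k_up @ k_dn                             # k.k

M = -k2 * np.eye(4) + np.outer(k_up, k_dn)   # M^mu_nu

print(np.linalg.det(M))          # ~0 (up to roundoff): the matrix is singular
print(np.linalg.matrix_rank(M))  # 3: one zero mode
print(M @ k_up)                  # ~[0, 0, 0, 0]: k^mu spans that zero mode
```

The zero mode is exactly the Fourier transform of a pure gauge mode $A^\mu\propto\partial^\mu\alpha$, which is why the gauge has to be fixed (or the gauge direction projected out) before any such linear system can be inverted.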
2 Answers
Not all initial value problems have a unique solution. Your example of the $\alpha$ function demonstrates that this initial value problem is of that kind.
In this case, the problem lies in the system of partial differential equations
$$
\partial_\nu\partial^\nu A^\mu-\partial^\mu\partial_\nu A^\nu=0
$$
itself; it does not put enough constraints on the functions $\varphi(\mathbf{x},t)$, $\mathbf{A}(\mathbf{x},t)$. It is somewhat similar to the situation in linear algebra where a system of $n$ linear equations for $n$ unknowns has infinitely many solutions.
A slightly different way to see this: notice that nowhere in the above system of PDEs do we find $\partial_t^2 A^0$ or $\partial_t A^0$ directly; only a spatial gradient of $\partial_t A^0$ is present. The equations for the $A^i$'s do not relate them directly to time derivatives of $\varphi$.
This means that if we have a solution $\varphi(\mathbf{x},t)$, $\mathbf{A}(\mathbf{x},t)$ of the initial value problem and replace the scalar potential by $\varphi' = \varphi(\mathbf{x},t)+ht^2$, where $h$ is a constant and $t=0$ is the initial time, the equations are still satisfied, and at $t=0$ the initial conditions are satisfied too. This would not be so obviously possible if the system directly contained time derivatives of $\varphi$. Consider the slightly different system
$$
\partial_\nu\partial^\nu A^\mu = 0
$$
(which in EM theory can be derived as a result of the Lorenz gauge choice); this does constrain $\partial_t^2\varphi$, so the above argument fails. I think this system should have a unique solution, because it is very similar to a set of equations for independent harmonic oscillators. However, for a proof, better check with mathematicians.
answered Nov 17 at 2:14, edited 2 days ago – Ján Lalinský
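As a concrete check of the $\varphi\to\varphi+ht^2$ argument, here is a short sympy sketch (my own, not part of the answer) verifying that the shift drops out of the equations entirely; restricting to $1+1$ dimensions with signature $(+,-)$ keeps the structure of the argument and is an assumption of the sketch.

```python
# Verify that replacing phi -> phi + h*t**2 leaves every component of
# Box A^mu - d^mu (d_nu A^nu) unchanged, in 1+1 dimensions, signature (+,-).
import sympy as sp

t, x, h = sp.symbols('t x h')
phi = sp.Function('phi')(t, x)   # A^0, the scalar potential
Ax  = sp.Function('A_x')(t, x)   # A^1

def maxwell_lhs(A0, A1):
    """Components of Box A^mu - d^mu (d_nu A^nu) in 1+1D."""
    box  = lambda f: sp.diff(f, t, 2) - sp.diff(f, x, 2)
    divA = sp.diff(A0, t) + sp.diff(A1, x)           # d_nu A^nu
    return (box(A0) - sp.diff(divA, t),              # mu = 0 component (d^0 = d_t)
            box(A1) + sp.diff(divA, x))              # mu = 1 component (d^1 = -d_x)

orig    = maxwell_lhs(phi, Ax)
shifted = maxwell_lhs(phi + h*t**2, Ax)              # the shift phi -> phi + h t^2

print([sp.simplify(s - o) for s, o in zip(shifted, orig)])   # [0, 0]
```

The differences simplify to zero because the missing $\partial_t^2 A^0$ term is exactly what would have detected the shift.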
@tparker, thanks for pointing this out, I missed a preposition, fixed.
– Ján Lalinský, 2 days ago
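The answer's closing point, that the gauge-fixed system $\partial_\nu\partial^\nu A^\mu=0$ does evolve uniquely from data like Eq. $(2)$, is also what one sees in practice: each component obeys an ordinary wave equation. Below is a minimal finite-difference sketch of my own (one component, $1+1$ dimensions, periodic boundaries, $c=1$, and an arbitrary Gaussian initial profile are all assumptions of the sketch).

```python
# Explicit leapfrog scheme for d_t^2 A = d_x^2 A: once the gauge is fixed,
# every time step is a definite function of the initial data f and g.
import numpy as np

N, length, dt = 200, 1.0, 0.002
dx = length / N
x = np.arange(N) * dx
r2 = (dt / dx)**2                     # Courant number squared (dt/dx <= 1 for stability)

f = np.exp(-100 * (x - 0.5)**2)       # A(x, t0): an arbitrary initial profile
g = np.zeros(N)                       # dA/dt(x, t0)

lap = lambda u: np.roll(u, -1) - 2*u + np.roll(u, 1)   # discrete Laplacian (periodic)

u_prev = f
u = f + dt * g + 0.5 * r2 * lap(f)    # Taylor step to start the leapfrog
for _ in range(500):                  # each step fully determined by the previous two
    u, u_prev = 2*u - u_prev + r2 * lap(u), u
print(u[:5])                          # one definite field configuration at t0 + 501*dt
```

Every update is an explicit function of the two previous time slices, so the discrete evolution is manifestly unique; the ambiguity of Eq. $(1)$ only reappears if one tries to evolve the un-gauge-fixed system, where the equations supply no update rule for $A^0$.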
Are you asking for the physical or the mathematical explanation? Dan Yand's answer gives the physical explanation.
Regarding the mathematical question: on what basis would you expect the field configuration to be uniquely determined by its initial data? Unlike for (uncoupled) ODEs, there's no theorem to that effect for general linear homogeneous second-order PDEs.
answered Nov 17 at 0:53, edited Nov 17 at 2:56 – tparker
The issue is not about PDE vs ODE. There are point-particle systems with gauge symmetries whose time evolution is not uniquely fixed by the equations of motion. And vice versa: there are field systems whose time evolution is uniquely fixed by the equations of motion (say, the heat/Schrödinger equation). The issue is about invertibility of the differential operator, equivalently about existence of a unique Green function. Obstructions may appear whether the system is one-dimensional or not.
– AccidentalFourierTransform, Nov 17 at 1:22
A Lagrangian of the form $L(q_1,q_2)=f(q_1-q_2)$, for arbitrary $f$, is invariant under $q_i(t)\to q_i(t)+\eta(t)$. The system has a gauge symmetry. I leave it to you to pick some specific $f$ and compute the Euler-Lagrange equations. You get two redundant equations of motion, so only one independent equation for two degrees of freedom. No unique solution. Etc. (And if we are just going to cite references, let me quote Henneaux, Teitelboim, "Quantization of Gauge Systems", which is a book about point particles, not fields.)
– AccidentalFourierTransform, Nov 17 at 2:51
@AccidentalFourierTransform Oops, you're right. I meant that a function $\mathbb{R}\to\mathbb{R}$ can't have a gauge freedom, but you can get around that by adding more variables at either end of the arrow. Edited to clarify.
– tparker, Nov 17 at 2:57
A system with a single degree of freedom, if it has a gauge symmetry, has no effective degrees of freedom at all. So its dynamics are purely topological and/or due to constraints. For example, a relativistic point particle, in the reparametrisation-invariant formalism, has a gauge symmetry, and it is still $\mathbb{R}\to\mathbb{R}$.
– AccidentalFourierTransform, Nov 17 at 3:00
@AccidentalFourierTransform I would describe a point particle (whether relativistic or not) with trajectory $(t(\tau), x(\tau))$ as being described by a function $\mathbb{R}\to\mathbb{R}^2$, not $\mathbb{R}\to\mathbb{R}$.
– tparker, Nov 17 at 3:10
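For the toy Lagrangian in the comment above, the redundancy of the Euler-Lagrange equations is quick to check with sympy. The sketch below is mine, and the concrete choice $f(u)=u^2/2$ is an arbitrary example, not something specified in the comment.

```python
# Toy gauge system L(q1, q2) = f(q1 - q2): the two Euler-Lagrange equations
# are redundant, so the motion is not fixed by initial data.
import sympy as sp

t = sp.symbols('t')
q1, q2 = sp.Function('q1')(t), sp.Function('q2')(t)
L = sp.Rational(1, 2) * (q1 - q2)**2          # f(q1 - q2) with f(u) = u**2 / 2

# L contains no velocities, so Euler-Lagrange reduces to dL/dq_i = 0.
el1 = sp.diff(L, q1)                          # q1(t) - q2(t)
el2 = sp.diff(L, q2)                          # q2(t) - q1(t)
print(sp.simplify(el1 + el2))                 # 0: the second equation is minus the first

# Only one independent equation, q1 = q2, for two functions of time:
# q1(t) = q2(t) = eta(t) solves it for any eta(t), which is exactly the
# gauge freedom q_i -> q_i + eta(t) described in the comment.
```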