Vector spaces. When in the real world are we checking if it's a vector space or not?
I am reading this text:

When in the real world are we checking to see if sets are vector spaces or not? The examples above seem like really specific sets...

Are there any places where we redefined scalar multiplication like this?

linear-algebra

asked Sep 17 at 2:02 by Jwan622 (13 votes)
Here are a few examples of vector spaces: the null space and range of a linear transformation; the set of all solutions to a homogeneous linear differential equation; the set of all polynomials; the set of all continuous functions $f:[0,1] \to \mathbb{R}$; the set of all symmetric $n \times n$ matrices. In each case, we must check that the axioms of a vector space are satisfied (assuming addition and scalar multiplication are defined in the obvious ways). But it's usually pretty easy to check that. – littleO, Sep 17 at 3:52

Here's a motivating question: Can the surface of the sphere be turned into a vector space? Suppose I pick an origin as a point on the surface, and some reasonable-looking definitions of addition and scalar multiplication of other surface points. Can I then start doing linear algebra instead of all that complicated spherical trigonometry stuff? – Rahul, Sep 17 at 4:22

Is physics not "real world enough" for you? – gented, Sep 17 at 8:50

The nice thing about having a theory already lying around is that the "hard work" of applying its theorems to an object is as simple as proving that that object meets the required axioms. Since we have already developed linear algebra, the only thing we need to do to apply it to arbitrary objects is show that those objects can be viewed in a way amenable to linear algebra, i.e., as members of a vector space. – BallpointBen, Sep 18 at 5:23

@Ovi: All I meant was that the answer to the last question in my comment is no, you can't just pick some vector-like operations and expect linear algebra to work. You have to check that they satisfy the vector space axioms, and odds are that they won't. – Rahul, Sep 18 at 6:26
2 Answers
Answer (accepted, 17 votes) – vadim123, answered Sep 17 at 2:05
Vector spaces, and vector space-like sets, crop up in a lot of places. The specific example here is artificial, yes, but it is meant to show that we need to be careful when checking whether a set of vector-like objects, together with two operations (vector addition and scalar multiplication), actually is a vector space or not.
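For concreteness, here is the sort of check this answer has in mind, using a deliberately "redefined" scalar multiplication (this example is mine, not from the text the question quotes). Take $\mathbb{R}^2$ with the usual addition, but declare $c \odot (x, y) = (cx, 0)$. A single failed axiom settles the matter:
$$1 \odot (x, y) = (1 \cdot x,\ 0) = (x, 0) \neq (x, y) \quad \text{whenever } y \neq 0,$$
so the identity axiom $1 \odot v = v$ fails and this structure is not a vector space, even though the addition alone is perfectly well behaved.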
Oh okay, so the examples given are artificial? Like we're often not checking vector spaces and vector-space-like sets that are this specific, right? – Jwan622, Sep 17 at 3:22

They will be specific, but typically considerably more complicated (and harder to check the vector space axioms for). – vadim123, Sep 17 at 3:49

@Jwan622: You will often want to check that things are vector spaces. They'll usually be more natural and less contrived than this example, but they can certainly be quite specific, and more complicated than this one. – Peter LeFanu Lumsdaine, Sep 18 at 7:02
Answer (54 votes) – Theo Bendit, answered Sep 17 at 3:36
You'll never have to prove something is a vector space in real life. You may want to prove something is a vector space, because vector spaces have a simply enormous amount of theory proven about them, and the non-trivial fact that you wish to establish about your specific object might boil down to a much more general, well-known result about vector spaces.

Here's my favourite example. It's still a little artificial, but I came by it through simple curiosity, rather than any course or the search for an example.

Consider the logic puzzle here. It's a classic. You have a $5 \times 5$ grid of squares, coloured black or white. Every time you press a square, it changes that square and the (up to four) adjacent squares from black to white or white to black. Your job is to press squares in such a way that you end up with every square being white.

So, my question to you is: can you form any configuration of white and black squares by pressing these squares? Put another way, is every $5 \times 5$ grid of black and white squares a valid puzzle with a valid solution?

Well, it turns out that this can be easily answered using linear algebra. We form a vector space of $5 \times 5$ matrices whose entries are in the field $\mathbb{Z}_2 = \{0, 1\}$. We represent white squares with $0$ and black squares with $1$. Such a vector space is finite, and contains $2^{25}$ vectors. Note that every vector is its own additive inverse (as is the case for any vector space over $\mathbb{Z}_2$).

Also note that the usual standard basis, consisting of matrices with $0$ everywhere except for a single $1$ in one entry, forms a basis for our vector space. Therefore, the dimension of the space is $25$.

Pressing each square corresponds to adding one of $25$ vectors to the current vector. For example, pressing the top left square will add the vector
$$\begin{pmatrix}
1 & 1 & 0 & 0 & 0 \\
1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0
\end{pmatrix}.$$

We are trying to find, therefore, a linear combination of these $25$ vectors that will sum to the current vector (remember $-v = v$ for all our vectors $v$).

So the question I posed to you boils down to asking whether these $25$ vectors span the $25$-dimensional space. By standard results in finite-dimensional linear algebra, this is equivalent to asking whether the set of $25$ vectors is linearly independent.

The answer is no, they are not linearly independent. In particular, if you press the buttons highlighted in the following picture, you'll obtain the white grid again, i.e. the additive identity.

Therefore, we have a non-trivial linear combination of the vectors, so they are linearly dependent, and hence not spanning. That is, there must exist certain configurations that cannot be attained, i.e. there are invalid puzzles with no solution.

I found the linear dependency while playing the game myself and noticing some of the asymmetries of the automatically generated solutions, even when the problem itself was symmetric. Proving the linear dependence is as easy as showing the above picture. I still don't know of an elegant way to find an example of a puzzle that can't be solved, though! So my proof is somewhat non-constructive, and very easy if you know some linear algebra and are willing to prove that a set is a vector space.
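The span/independence question at the end of this answer can also be verified mechanically. Below is a minimal sketch (not part of the original answer; press_vector and rank_mod2 are ad hoc helper names, not library functions) that builds the 25 press vectors and row-reduces them over $\mathbb{Z}_2$:

    # Build the 25 "button press" vectors for the 5 x 5 Lights Out grid
    # and compute the rank of the resulting 25 x 25 matrix over Z_2.

    def press_vector(r, c, n=5):
        """Flattened 0/1 vector of the squares toggled by pressing (r, c)."""
        v = [0] * (n * n)
        for dr, dc in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]:
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n:
                v[rr * n + cc] = 1
        return v

    def rank_mod2(rows):
        """Rank of a 0/1 matrix over Z_2, via Gauss-Jordan elimination."""
        rows = [row[:] for row in rows]
        rank, ncols = 0, len(rows[0])
        for col in range(ncols):
            pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
            if pivot is None:
                continue
            rows[rank], rows[pivot] = rows[pivot], rows[rank]
            for i in range(len(rows)):
                if i != rank and rows[i][col]:
                    rows[i] = [(a + b) % 2 for a, b in zip(rows[i], rows[rank])]
            rank += 1
        return rank

    matrix = [press_vector(r, c) for r in range(5) for c in range(5)]
    r = rank_mod2(matrix)
    print(r, 25 - r)  # rank < 25 means the press vectors do not span the space

On the standard $5 \times 5$ board this should report rank 23 and nullity 2, consistent with the dependence described above and with the "only four solutions per solvable puzzle" observation in the comments below.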
If you want to read more about this, it's known more commonly as Lights Out. The Wikipedia page also describes the mathematical analysis of the game using, indeed, linear algebra on $\mathbb{Z}_2$. – Rahul, Sep 17 at 4:07

@Rahul Thank you for the link! I've long suspected that there were only four solutions to any viable $5 \times 5$ problem, but I've never found an elegant way to prove it, so I haven't bothered. I've also shown that the $4 \times 5$ case is linearly independent/spanning. I've also wondered if there's a characterisation of linear independence for the general $n \times m$ system, but I haven't properly investigated it. – Theo Bendit, Sep 17 at 4:12

Well, if you work from top to bottom, it is obvious that to solve a given puzzle you only have to fix the first row and everything else will also be fixed. So to find a puzzle that cannot be solved, you just have to go through all possible first rows (32 of them; fewer if you mod the first row in your additive identity) and find out which last rows are impossible. Similar reasoning tells you that every impossible state is equal to some possible state plus one of those impossible last rows. – user21820, Sep 17 at 6:44

But there is a generalized variant that does not admit the analysis in my previous comment. – user21820, Sep 17 at 6:57

+1 Very cool! – Ovi, Sep 18 at 5:23