Use LEFT and RIGHT arrow keys to navigate between flashcards;
Use UP and DOWN arrow keys to flip the card;
H to show hint;
A reads the text aloud (text-to-speech);
105 Cards in this Set
- Front
- Back
A linear transformation T : R^n → R^m is completely
determined by its effect on the columns of the n × n identity matrix |
TRUE. The columns of the identity matrix are the
standard basis vectors of R^n. Since every vector can be written as a linear combination of these, and T is a linear transformation, if we know where these columns go, we know everything. |
|
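A quick sketch of the card above, using a made-up linear map on R^2 (a reflection through the line y = x, chosen just for illustration): once we record T(e1) and T(e2) as the columns of the standard matrix, that matrix reproduces T on any vector.

```python
# Hypothetical linear map T on R^2: reflection through the line y = x.
def T(v):
    x, y = v
    return (y, x)

e1, e2 = (1, 0), (0, 1)
col1, col2 = T(e1), T(e2)  # columns of the standard matrix A = [T(e1) T(e2)]

def apply_standard_matrix(v):
    # Computes A v using only the two stored columns.
    x, y = v
    return (col1[0] * x + col2[0] * y, col1[1] * x + col2[1] * y)

# Knowing where e1 and e2 go determines T everywhere.
assert apply_standard_matrix((3, -5)) == T((3, -5))
```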
If T : R^2 → R^2
rotates vectors about the origin through an angle, then T is a linear transformation |
TRUE. To show
this we would show the properties of linear transformations are preserved under rotations. |
|
When two linear transformations are performed one after
another, the combined effect may not always be a linear transformation |
FALSE. Again, check the properties to show the composition is a
linear transformation. |
|
The columns of the standard matrix for a linear
transformation from R^n to R^m are the images of the columns of the n × n identity matrix. |
TRUE
|
|
The standard matrix of a linear transformation from R^2
to R^2 that reflects points through the horizontal axis, the vertical axis, or the origin has the form [a 0; 0 d] where a and d are ±1 |
TRUE. We can check this by checking the images of the
basis vectors. |
|
A mapping T : R^n → R^m is one-to-one if each vector in R^n
maps onto a unique vector in R^m |
FALSE. A mapping is
one-to-one if each vector in R^m is the image of at most one vector in R^n |
|
If A is a 3 × 2 matrix, then the transformation x ↦ Ax cannot
map R^2 onto R^3 |
TRUE. You cannot map a space of lower
dimension ONTO a space of higher dimension. |
|
A homogeneous equation is always consistent.
|
TRUE - The
trivial solution is always a solution. |
|
The equation Ax = 0 gives an explicit description of its
solution set. |
FALSE - The equation gives an implicit
description of the solution set. |
|
The homogeneous equation Ax = 0 has the trivial solution if
and only if the equation has at least one free variable |
FALSE
- The trivial solution is always a solution of Ax = 0, whether or not there are free variables. (A free variable is needed for a NONtrivial solution.) |
|
The equation x = p + tv describes a line through v parallel to
p |
False. The line goes through p and is parallel to v.
|
|
The solution set of Ax = b is the set of all vectors of the form
w = p + v_h, where v_h is any solution of the equation Ax = 0 |
FALSE This is only true when there exists some vector p such
that Ap = b. |
|
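A sketch of the p + v_h structure above, using a made-up singular 2 × 2 system (the matrix, b, p, and v_h here are all illustrative choices, not from the deck): a particular solution p plus any multiple of a homogeneous solution v_h again solves Ax = b.

```python
# Hypothetical 2x2 system with a free variable: A is singular.
A = ((1, 2), (2, 4))
b = (3, 6)

def matvec(M, v):
    # M v for a 2x2 matrix M stored as a tuple of rows.
    return tuple(sum(M[i][j] * v[j] for j in range(2)) for i in range(2))

p = (3, 0)     # one particular solution: A p = b
vh = (-2, 1)   # a homogeneous solution:  A vh = 0

assert matvec(A, p) == b
assert matvec(A, vh) == (0, 0)

# Every translate p + t*vh also solves A x = b.
for t in (-1, 0, 2, 5):
    x = (p[0] + t * vh[0], p[1] + t * vh[1])
    assert matvec(A, x) == b
```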
If x is a nontrivial solution of Ax = 0, then every entry in x is
nonzero |
FALSE. At least one entry in x is nonzero
|
|
The equation x = x2u + x3v, with x2 and x3 free (and neither
u nor v a multiple of the other), describes a plane through the origin |
True
|
|
The equation Ax = b is homogeneous if the zero vector is a
solution |
TRUE. If the zero vector is a solution then
b = Ax = A0 = 0. So the equation is Ax = 0, thus homogeneous. |
|
The effect of adding p to a vector is to move the vector in the
direction parallel to p. |
TRUE. We can also think of adding p
as sliding the vector along p. |
|
The solution set of Ax = b is obtained by translating the
solution set of Ax = 0 |
FALSE. This only applies to a
consistent system. (Linear Algebra, David Lay) |
|
The columns of the matrix A are linearly independent if the
equation Ax = 0 has the trivial solution. |
FALSE. The trivial
solution is always a solution; independence requires that Ax = 0 have ONLY the trivial solution |
|
If S is a linearly dependent set, then each vector is a linear
combination of the other vectors in S. |
FALSE. For example,
(1, 1), (2, 2) and (5, 4) are linearly dependent, but the last is not a linear combination of the first two. |
|
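The counterexample in the card above can be checked directly (a small sketch, nothing beyond the three vectors the card names): 2u − v = 0 is a nontrivial dependence relation, yet every combination of u and v has equal coordinates, so w = (5, 4) is never reached.

```python
u, v, w = (1, 1), (2, 2), (5, 4)

# The set {u, v, w} is linearly dependent: 2u - v = 0 is a nontrivial relation.
relation = (2 * u[0] - v[0], 2 * u[1] - v[1])
assert relation == (0, 0)

# But w is not a combination of u and v: a*u + b*v always has equal coordinates.
def combo(a, b):
    return (a * u[0] + b * v[0], a * u[1] + b * v[1])

assert combo(1, 2) == (5, 5)   # equal coordinates, as always
assert w[0] != w[1]            # (5, 4) lies off the line y = x = Span{u, v}
```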
A four by five matrix has linearly independent columns
|
FALSE. There are five columns, each with four entries; thus by
Thm 8 they are linearly dependent |
|
If x and y are linearly independent, and if {x, y, z} is linearly
dependent, then z is in Span{x, y} |
TRUE. Since x and y are
linearly independent, and {x, y, z} is linearly dependent, it must be that z can be written as a linear combination of the other two, thus z is in their span. |
|
Two vectors are linearly dependent if and only if they lie on a
line through the origin |
TRUE. If they lie on a line through
the origin, then one vector is a scalar multiple of the other, so they are linearly dependent (and conversely). |
|
If a set contains fewer vectors than there are entries in the
vectors, then the set is linearly independent. |
FALSE. For
example, (1, 2, 3) and (2, 4, 6) are linearly dependent. |
|
If x and y are linearly independent, and if z is in Span{x, y},
then {x, y, z} is linearly dependent |
TRUE. If z is
in Span{x, y} then z is a linear combination of x and y, which can be rearranged to show linear dependence. |
|
If a set in R^n
is linearly dependent, then the set contains more vectors than there are entries in each vector. |
FALSE. For
example, in R^3 the vectors (1, 2, 3) and (3, 6, 9) are linearly dependent. |
|
A linear transformation is a special type of function
|
TRUE
The properties are (i) T(u + v) = T(u) + T(v) and (ii) T(cu) = cT(u). |
|
If A is a 3 × 5 matrix and T is a transformation defined by
T(x) = Ax, then the domain of T is R^3 |
FALSE. The domain
is R^5 |
|
If A is an m × n matrix, then the range of the transformation
x ↦ Ax is R^m |
FALSE. R^m
is the codomain; the range is where we actually land. |
|
A transformation T is linear if and only if
T(c1v1 + c2v2) = c1T(v1) + c2T(v2) for all v1 and v2 in the domain of T and for all scalars c1 and c2. |
TRUE. From the
definition of a linear transformation we can derive this equation; conversely, if it holds, then taking c1 = c2 = 1 gives the first part of the definition (additivity), and taking c2 = 0 gives the second (homogeneity). |
|
Every matrix transformation is a linear transformation.
|
TRUE.
To actually show this, we would have to show all matrix transformations satisfy the two criteria of linear transformations. |
|
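The two criteria from the card above can be spot-checked on a made-up 3 × 2 matrix (the matrix, vectors, and scalar here are arbitrary illustrative choices): x ↦ Ax respects both vector addition and scalar multiplication.

```python
# Matrix-vector product for a matrix stored as a tuple of rows.
def matvec(A, v):
    return tuple(sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A)))

A = ((1, -2), (0, 3), (4, 5))   # a hypothetical 3x2 matrix
u, v, c = (1, 2), (-3, 4), 7

# T(u + v) = T(u) + T(v)
lhs = matvec(A, (u[0] + v[0], u[1] + v[1]))
rhs = tuple(matvec(A, u)[i] + matvec(A, v)[i] for i in range(3))
assert lhs == rhs

# T(cu) = c T(u)
assert matvec(A, (c * u[0], c * u[1])) == tuple(c * y for y in matvec(A, u))
```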
The codomain of the transformation x 7! Ax is the set of all linear combinations of the columns of A
|
FALSE. If A is
m × n, the codomain is R^m. The original statement is describing the range |
|
If T : R^n → R^m
is a linear transformation and if c is in R^m, then a uniqueness question is "Is c in the range of T?" |
FALSE.
This is an existence question. |
|
A linear transformation preserves the operations of vector
addition and scalar multiplication |
TRUE. This is part of the
definition of a linear transformation |
|
The superposition principle is a physical description of a linear
transformation. |
TRUE The book says so. (page 77)
|
|
In some cases a matrix may be row reduced to more than one
matrix in reduced row echelon form, using different sequences of row operations. |
FALSE
|
|
The row reduction algorithm applies only to augmented
matrices for a linear system. |
FALSE
|
|
A basic variable in a linear system is a variable that
corresponds to a pivot column in the coefficient matrix. |
TRUE
|
|
Finding a parametric description of the solution set of a linear
system is the same as solving the system. |
TRUE
|
|
If one row in an echelon form of an augmented matrix is
[0 0 0 5 0 ], then the associated linear system is inconsistent. |
FALSE. The row corresponds to the equation 5x4 = 0, which is consistent.
|
|
The echelon form of a matrix is unique
|
FALSE. Only the REDUCED echelon form is unique.
|
|
The pivot positions in a matrix depend on whether row
interchanges are used in the row reduction process. |
FALSE
|
|
Reducing a matrix to echelon form is called the forward phase
of the row reduction process |
TRUE
|
|
Whenever a system has free variables, the solution set
contains many solutions |
FALSE. The system could still be inconsistent, in which case there are no solutions.
|
|
A general solution of a system is an explicit description of all
solutions of the system |
TRUE
|
|
An example of a linear combination of vectors v1 and v2 is the
vector (1/2)v1 |
TRUE
|
|
The solution set of the linear system whose augmented matrix
is [a1 a2 a3 b] is the same as the solution set of the equation x1a1 + x2a2 + x3a3 = b |
TRUE
|
|
The set Span{u, v} is always visualized as a plane through
the origin. |
FALSE
|
|
Any list of five real numbers is a vector in R^5 |
TRUE
|
|
The weights c1, ..., cp in a linear combination
c1v1 + ... + cpvp cannot all be zero |
FALSE
|
|
When u and v are nonzero vectors, Span{u, v} contains the
line through u and the origin. |
TRUE
|
|
Asking whether the linear system corresponding to an
augmented matrix [a1 a2 a3 b] has a solution amounts to asking whether b is in Span{a1, a2, a3} |
TRUE
|
|
The equation Ax = b is referred to as the vector equation.
|
FALSE. It is the matrix equation; x1a1 + x2a2 + x3a3 = b is the vector equation.
|
|
The vector b is a linear combination of the columns of a
matrix A if and only if the equation Ax = b has at least one solution |
TRUE
|
|
The equation Ax = b is consistent if the augmented matrix
[A b] has a pivot position in every row |
FALSE. A pivot in some row of [A b] could lie in the augmented column b, which means inconsistency.
|
|
The first entry in the product Ax is a sum of products.
|
TRUE
|
|
If the columns of an m × n matrix span R^m,
then the equation Ax = b is consistent for each b in R^m |
TRUE
|
|
If A is an m × n matrix and if the equation Ax = b is
inconsistent for some b in R^m, then A cannot have a pivot position in every row. |
TRUE
|
|
Every matrix equation Ax = b corresponds to a vector
equation with the same solution set |
TRUE
|
|
Any linear combination of vectors can always be written in the
form Ax for a suitable matrix A and vector x. |
TRUE
|
|
The solution set of the linear system whose augmented matrix
is [a1 a2 a3 b] is the same as the solution set of Ax = b, if A = [a1 a2 a3] |
TRUE
|
|
If the equation Ax = b is inconsistent, then b is not in the set
spanned by the columns of A. |
TRUE
|
|
If the augmented matrix [A b] has a pivot position in every
row, then the equation Ax = b is inconsistent. |
FALSE
|
|
If A is an m × n matrix whose columns do not span R^m,
then the equation Ax = b is inconsistent for some b in R^m |
TRUE
|
|
If A and B are 2 × 2 matrices with columns a1, a2 and b1, b2, then
AB = [a1b1 a2b2] |
FALSE. Matrix multiplication is "row by
column", not entrywise products of corresponding columns. |
|
Each column of AB is a linear combination of the columns of
B using weights from the corresponding column of A |
FALSE.
Swap A and B and then it's true: the jth column of AB is Ab_j, a linear combination of the columns of A with weights from the jth column of B. |
|
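The column fact from the card above can be verified on small made-up matrices (a sketch; the entries are arbitrary): each column of AB equals A applied to the corresponding column of B.

```python
# Hypothetical 2x2 matrices, stored as tuples of rows.
A = ((1, 2), (3, 4))
B = ((5, 6), (7, 8))

def matmul(X, Y):
    return tuple(tuple(sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2))
                 for i in range(2))

AB = matmul(A, B)
for j in range(2):
    bj = (B[0][j], B[1][j])        # jth column of B
    col = (AB[0][j], AB[1][j])     # jth column of AB
    # A * bj: a combination of A's columns with weights from bj.
    assert col == tuple(sum(A[i][k] * bj[k] for k in range(2)) for i in range(2))
```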
AB + AC = A(B + C)
|
TRUE Matrix multiplication distributes
over addition. |
|
A^T + B^T = (A + B)^T
|
TRUE. See properties of transposition.
We can also think it through directly: when we add matrices we add corresponding entries, and these remain corresponding entries after transposition. |
|
The transpose of a product of matrices equals the product of
their transposes in the same order |
FALSE. The transpose of a
product of matrices equals the product of their transposes in the reverse order: (AB)^T = B^T A^T. |
|
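The reverse-order rule from the card above checked on made-up 2 × 2 matrices (arbitrary entries, chosen only so the two orders actually differ): (AB)^T matches B^T A^T but not A^T B^T.

```python
A = ((1, 2), (3, 4))
B = ((0, 1), (5, 2))

def matmul(X, Y):
    return tuple(tuple(sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2))
                 for i in range(2))

def transpose(X):
    return tuple(tuple(X[j][i] for j in range(2)) for i in range(2))

# (AB)^T equals B^T A^T (reverse order), not A^T B^T.
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
assert transpose(matmul(A, B)) != matmul(transpose(A), transpose(B))
```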
If A and B are 3 × 3 and B = [b1 b2 b3], then
AB = [Ab1 + Ab2 + Ab3]. |
FALSE. The columns Ab_j are right, but there
should not be +'s: AB = [Ab1 Ab2 Ab3]. Remember the answer should also be 3 × 3. |
|
The second row of AB is the second row of A multiplied on
the right by B |
TRUE
|
|
(AB)C = (AC)B
|
FALSE. This would reorder the factors B and C; matrix multiplication is not
commutative. |
|
The transpose of a sum of matrices equals the sum of their
transposes |
TRUE
|
|
In order for a matrix B to be the inverse of A, both equations
AB = I and BA = I must be true. |
TRUE. (We'll see later that
for square matrices, if AB = I then there is some C such that BC = I.) CHALLENGE: Can you find an inverse for any non-square matrix? If so, find one; if not, explain why. Also, in the above statement about square matrices, does C = A? |
|
If A is an invertible n × n matrix, then the equation Ax = b is
consistent for each b in R^n. |
TRUE. Since A is invertible we
have x = A^-1 b |
|
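A sketch of the card above using the textbook 2 × 2 inverse formula on a made-up invertible matrix (entries and right-hand side are arbitrary): x = A^-1 b really does satisfy Ax = b.

```python
from fractions import Fraction

# Hypothetical invertible 2x2 matrix [[a, bb], [c, d]].
a, bb, c, d = 2, 1, 5, 3
det = a * d - bb * c          # 2*3 - 1*5 = 1, nonzero, so A is invertible

# Textbook formula: A^-1 = (1/det) * [[d, -bb], [-c, a]].
inv = tuple(tuple(Fraction(x, det) for x in row) for row in ((d, -bb), (-c, a)))

rhs = (4, 7)
# x = A^-1 b ...
x = tuple(inv[i][0] * rhs[0] + inv[i][1] * rhs[1] for i in range(2))
# ... solves A x = b.
Ax = (a * x[0] + bb * x[1], c * x[0] + d * x[1])
assert Ax == rhs
```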
Each elementary matrix is invertible
|
TRUE. Let K be the
elementary row operation required to change the elementary matrix back into the identity. If we perform K on the identity, we get the inverse. |
|
A product of invertible n × n matrices is invertible, and the
inverse of the product is the product of their inverses in the same order. |
FALSE. It is invertible, but the inverse is the product of the
inverses in the reverse order |
|
If A =
[a b; c d] and ad = bc, then A is not invertible |
TRUE. If ad = bc, then det A = ad - bc = 0, so A is not invertible. |
|
If A can be row reduced to the identity matrix, then A must
be invertible. |
TRUE. The algorithm presented in this chapter
tells us how to find the inverse in this case. |
|
If A is invertible, then the elementary row operations that reduce
A to the identity also reduce A^-1 to the identity |
FALSE.
They also reduce the identity to A^-1 |
|
If the equation Ax = 0 has only the trivial solution, then A is
row equivalent to the n × n identity matrix. |
TRUE From Thm
8 |
|
If the columns of A span R^n,
then the columns are linearly independent |
TRUE. Again from Thm 8. Also, if n vectors span
R^n they must be linearly independent. |
|
If A is an n × n matrix, then the equation Ax = b has at least
one solution for each b in R^n. |
FALSE. We need to know more
about A, like whether it is invertible (or anything else in Thm 8) |
|
If the equation Ax = 0 has a nontrivial solution, then A has
fewer than n pivot positions. |
TRUE This comes from the "all
false" part of THM 8. (The statements are either all true or all false.) |
|
If A^T is not invertible, then A is not invertible
|
TRUE Also
from the all false part of theorem 8 |
|
If the columns of A are linearly independent, then the columns
of A span R n . |
TRUE Thm 8
|
|
If the equation Ax = b has at least one solution for each b in R^n,
then the solution is unique for each b. |
TRUE Thm 8
|
|
If the linear transformation x ↦ Ax maps R^n
into R^n, then A has n pivot positions. |
FALSE. Since A is n × n, the linear
transformation x ↦ Ax automatically maps R^n into R^n; this doesn't tell us anything about A. |
|
If there is a b in R^n
such that the equation Ax = b is inconsistent, then the transformation x ↦ Ax is not one-to-one. |
TRUE Thm 8
|
|
The (i, j)-cofactor of a matrix A is the matrix A_ij obtained by
deleting from A its ith row and jth column. |
FALSE. The
cofactor is the determinant of this A_ij times (-1)^(i+j) |
|
The cofactor expansion of det A down a column is the
negative of the cofactor expansion along a row |
FALSE. We
can expand down any row or column and get the same determinant |
|
The determinant of a triangular matrix is the sum of the
entries of the main diagonal. |
FALSE It is the product of the
diagonal entries |
|
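The product-not-sum fact from the card above, checked on a made-up upper-triangular matrix via cofactor expansion along the first row (a sketch; the entries are arbitrary):

```python
def det3(M):
    # Cofactor expansion along the first row of a 3x3 matrix.
    def det2(a, b, c, d):
        return a * d - b * c
    return (M[0][0] * det2(M[1][1], M[1][2], M[2][1], M[2][2])
            - M[0][1] * det2(M[1][0], M[1][2], M[2][0], M[2][2])
            + M[0][2] * det2(M[1][0], M[1][1], M[2][0], M[2][1]))

# A hypothetical upper-triangular matrix: det is the PRODUCT of the diagonal.
T = ((2, 7, 1), (0, 3, 5), (0, 0, 4))
assert det3(T) == 2 * 3 * 4   # 24, not the sum 2 + 3 + 4 = 9
```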
A row replacement operation does not affect the determinant
of a matrix. |
TRUE Just make sure you don't multiply the row
you are replacing by a constant. |
|
If the columns of A are linearly dependent, then det A = 0.
|
TRUE. Linearly dependent columns mean A is not invertible,
so det A = 0. |
|
det(A + B) = det A + det B.
|
FALSE. This is true for products,
however: det(AB) = (det A)(det B). |
|
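A concrete counterexample for the card above, using made-up 2 × 2 matrices (arbitrary entries): the determinant is not additive, but it is multiplicative.

```python
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(X, Y):
    return tuple(tuple(sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2))
                 for i in range(2))

A = ((1, 2), (3, 4))   # det A = -2
B = ((0, 1), (1, 0))   # det B = -1
ApB = tuple(tuple(A[i][j] + B[i][j] for j in range(2)) for i in range(2))

assert det2(ApB) == -8                            # det(A + B) ...
assert det2(A) + det2(B) == -3                    # ... is not det A + det B
assert det2(matmul2(A, B)) == det2(A) * det2(B)   # but det IS multiplicative
```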
If two row interchanges are made in succession, then the new
determinant equals the old determinant |
TRUE. Both interchanges
multiply the determinant by -1, and (-1)(-1) = 1. |
|
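The sign bookkeeping from the card above, checked on a made-up 2 × 2 matrix: one row interchange flips the sign of the determinant, and a second interchange flips it back.

```python
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = ((1, 2), (3, 4))          # det = -2
once = (A[1], A[0])           # one row interchange
twice = (once[1], once[0])    # a second interchange restores A

assert det2(once) == -det2(A)   # one swap flips the sign
assert det2(twice) == det2(A)   # two swaps: (-1)(-1) = 1
```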
The determinant of A is the product of the diagonal entries in
A. |
FALSE unless A is triangular.
|
|
If det A is zero, then two rows or two columns are the same,
or a row or a column is zero. |
FALSE The converse is true,
however. |
|
det(A^T) = -det A
|
FALSE. det(A^T) = det A when A is n × n
|
|
If f is a function in the vector space V of all real-valued
functions on R, and if f(t) = 0 for some t, then f is the zero vector in V |
FALSE We need f(t) = 0 for all t.
|
|
A vector is an arrow in three-dimensional space
|
FALSE This
is an example of a vector, but there are certainly vectors not of this form. |
|
A subset H of a vector space V, is a subspace of V if the zero
vector is in H |
FALSE We also need the set to be closed under
addition and scalar multiplication. |
|
A subspace is also a vector space
|
TRUE This is the definition
of subspace, a subset that satisfies the vector space properties. |
|
Analogue signals are used in the major control systems for the
space shuttle, mentioned in the introduction to the chapter. |
FALSE Digital signals are used...
|