Science Forums

A question for Qfwfq (or anyone else who could answer)


mang73


The determinant is the product of the diagonal elements, just as if the matrix were diagonal. The easiest way to show this is to expand the determinant along either the first column or the bottom row. The minor of the only non-zero element is itself triangular, so the argument repeats by induction....
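The expansion described above can be sketched in plain Python (the matrix values are just illustrative): a cofactor expansion along the first column, where for a triangular matrix only the top entry contributes and its minor is again triangular.

```python
def det(M):
    """Determinant by cofactor expansion along the first column."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for i in range(n):
        if M[i][0] == 0:
            continue  # for an upper-triangular M, only i == 0 contributes
        minor = [row[1:] for r, row in enumerate(M) if r != i]
        total += (-1) ** i * M[i][0] * det(minor)
    return total

# Hypothetical upper-triangular matrix: det should be 2 * 3 * 5 = 30
T = [[2, 7, 1],
     [0, 3, 4],
     [0, 0, 5]]
print(det(T))  # 30
```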

 

They must be using O for the null vector; it is often used. I can't do the whole proof here, but a zero determinant means the matrix represents a map which isn't injective, because the kernel contains more than just the null vector. If the determinant isn't 0, then the kernel contains only the null vector and the map is injective. This implies the solution is unique, which also covers the O -> O case.
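A small 2x2 sketch of the two cases, in plain Python with made-up numbers: when the determinant is non-zero, Cramer's rule gives the unique solution (and AX = O gives only X = O); when it is zero, the kernel contains a non-zero vector.

```python
def solve2(a, b, c, d, p, q):
    """Solve [[a,b],[c,d]] (x,y) = (p,q) by Cramer's rule; None if det == 0."""
    det = a * d - b * c
    if det == 0:
        return None
    return ((p * d - b * q) / det, (a * q - p * c) / det)

# det != 0: AX = O has only the null-vector solution
print(solve2(2, 1, 1, 3, 0, 0))   # (0.0, 0.0)

# det == 0: the kernel is bigger than {O}; e.g. (1, -1) is in ker of [[1,1],[2,2]]
A = [[1, 1], [2, 2]]
v = (1, -1)
print(A[0][0]*v[0] + A[0][1]*v[1], A[1][0]*v[0] + A[1][1]*v[1])   # 0 0
```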

 

Questions 2 and 3 are closely linked: an invertible map is injective and vice versa. It's easy to see from the product rule that you posted. What does inverse mean? It means that if B is the inverse of A, then AB is the identity matrix. What's the determinant of the identity matrix? The answer follows easily!
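The product-rule argument can be checked numerically with a small plain-Python sketch (the matrix A is just an illustrative invertible example): det(A) * det(B) = det(AB) = det(I) = 1, so det(B) = 1/det(A).

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def inv2(M):
    """Inverse of a 2x2 matrix (assumes det != 0)."""
    d = det2(M)
    return [[ M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d,  M[0][0]/d]]

def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[4, 7], [2, 6]]        # hypothetical invertible matrix, det = 10
B = inv2(A)
print(det2(matmul2(A, B)))  # 1.0 up to floating-point rounding (det of identity)
print(det2(A) * det2(B))    # likewise 1.0: the product rule
```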


Hey. How are you? Well, it's Thursday again, so you have a visitor.

 

Question: Let M be a square n x n matrix which is equal to its transpose. If X, Y are column n-vectors, then

 

tXMY

 

is a 1x1 matrix, which we identify with a number (here tX denotes the transpose of X). Show that the map

 

(X,Y) -> tXMY

 

satisfies the three properties SP1, SP2, SP3. Give an example of a 2x2 matrix M such that the product is not positive definite.

Thank you


Essentially M represents an arbitrary scalar product; the standard one is when M is the identity matrix. I guess SP1, SP2 and SP3 are the standard properties in the definition of a scalar product, which I can't remember offhand except for symmetry, which follows easily from M being equal to its transpose. I'm a bit busy so I can't say much at the moment; I'll think about it when I can. ;)

 

For the example, an easy one is:

 

1 0

0 -1

 

With this M, the scalar product of any non-zero vector (0, x) with itself is negative, while that of any non-zero vector (x, 0) with itself is positive.
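A quick plain-Python check of that M (the test vectors are arbitrary): the form tXMY is symmetric because M equals its transpose, but it is not positive definite.

```python
def bilinear(M, X, Y):
    """tX M Y for 2-vectors: sum over i, j of X[i]*M[i][j]*Y[j]."""
    return sum(X[i] * M[i][j] * Y[j] for i in range(2) for j in range(2))

M = [[1, 0],
     [0, -1]]   # the symmetric matrix suggested above

print(bilinear(M, (3, 0), (3, 0)))   # 9  -> positive on (x, 0)
print(bilinear(M, (0, 2), (0, 2)))   # -4 -> negative on (0, x): not positive definite
print(bilinear(M, (1, 2), (3, 4)) == bilinear(M, (3, 4), (1, 2)))  # True (symmetry)
```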


Hi. Thanks a lot for giving me even that much, considering you're busy now. I have one more that I need by 5 PM Italy time on Friday. If you can, I'd appreciate it.

 

Assume that the scalar product is positive definite. Let v1, ..., vn be non-zero elements which are mutually perpendicular, that is, <vi, vj> = 0 if i is not equal to j.

 

Show that they are linearly independent.

 

Thanks. ;)


Now, this one is easy enough. I'd go for a reductio ad absurdum, so let's "pretend" there is a set of coefficients a1, ..., an, not all zero, such that the linear combination LC = a1 v1 + ... + an vn is zero. Now, for each of the vi, consider <vi, LC> = <vi, 0> = 0. By the hypothesis of them being mutually perpendicular, all the cross terms vanish, so this means ai<vi, vi> = 0. But the scalar product is positive definite and vi isn't the null vector, so <vi, vi> > 0 and therefore ai = 0. As this holds for every i, all the ai must be 0, against the assumption. Therefore LC can't be zero unless all the ai are.
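The key collapse step, <vi, LC> = ai<vi, vi>, can be verified numerically with the standard dot product (the vectors and coefficients below are just an illustrative mutually perpendicular set).

```python
def dot(u, v):
    """Standard (positive definite) scalar product on R^3."""
    return sum(x * y for x, y in zip(u, v))

# Mutually perpendicular non-zero vectors
v1, v2, v3 = (1, 1, 0), (1, -1, 0), (0, 0, 2)
assert dot(v1, v2) == dot(v1, v3) == dot(v2, v3) == 0

a = (2, -3, 5)   # arbitrary coefficients for the linear combination LC
LC = tuple(a[0]*x + a[1]*y + a[2]*z for x, y, z in zip(v1, v2, v3))

# Perpendicularity kills the cross terms: <vi, LC> = ai * <vi, vi>
for ai, vi in zip(a, (v1, v2, v3)):
    print(dot(vi, LC) == ai * dot(vi, vi))   # True each time
```

So if LC were the null vector, each <vi, LC> would be 0, forcing each ai = 0 since <vi, vi> > 0.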

 

Another property of scalar products is bilinearity; was this one of the three you meant yesterday? It's quite easy to show for the matrix M. In my vague memory, the third might be the Schwarz inequality or something, but of course it depends on how one generalizes the notion of scalar product. For instance, a "true" scalar product is one such that the form <v, v> is positive definite.


Hi! I am back with only one question for this week. By the way, I have to say that with your answers not only did I get full credit, but in one case the grader was in "awe"!! He wrote, "slick!" Thanks for your help. As for this question I have absolutely no idea, so here it goes. (By the way, there is no lambda sign, so I used $ instead; in the case of $v it is NOT lambda times v, it is lambda OF v.)

 

Let V be a vector space of dimension n over the field K. Let V** be the dual space of V*. Show that each element v "element of" V gives rise to an element $v in V**, and that the map v --> $v gives an isomorphism of V with V**.

 

Thank you


Unfortunately I'm very busy and I had to give it some thought after work, although it isn't a difficult thing, only subtle and a bit tricky. Certainly you should use some theorems prior to this one. The set of functionals on V, called V*, is itself a linear space, and this is why it too has a dual, V**. The trick is that, for a fixed v in V and for any f in V*, Fv(f) = f(v) is a scalar, and it is easy to show that this expression is linear on V*:

 

Fv(af + bg) = (af + bg)(v) = af(v) + bg(v) = aFv(f) + bFv(g)

 

It's not hard to see that the map v --> Fv is one to one, and it is onto because V and V** have the same finite dimension; we also need to show that it's linear:

 

au + bv --> F[au + bv] = aFu + bFv

 

Not really too hard: for any f, F[au + bv](f) = f(au + bv) = af(u) + bf(v) = aFu(f) + bFv(f)

 

I hope it's a good enough outline.
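The outline above can be checked concretely in plain Python by taking V = R^3, representing a functional by its coefficient vector, and representing Fv as "evaluation at v" (all names and test values here are illustrative).

```python
def functional(coeffs):
    """A functional on R^3, represented by its coefficient vector."""
    return lambda v: sum(c * x for c, x in zip(coeffs, v))

def F(v):
    """The element of V** induced by v: evaluation at v."""
    return lambda f: f(v)

f = functional((1, 2, 3))
g = functional((0, -1, 1))
u, v = (1, 0, 2), (0, 3, 1)
a, b = 2, -5

# Linearity of Fv on V*: Fv(af + bg) = a Fv(f) + b Fv(g)
af_bg = lambda w: a * f(w) + b * g(w)
print(F(u)(af_bg) == a * F(u)(f) + b * F(u)(g))   # True

# Linearity of the map v -> Fv: F[au + bv](f) = a Fu(f) + b Fv(f)
au_bv = tuple(a * x + b * y for x, y in zip(u, v))
print(F(au_bv)(f) == a * F(u)(f) + b * F(v)(f))   # True
```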

