My original solution is "airtight". The only assumptions are the given ones: the v's are dependent, and the w's have the given definitions. I do not need the assumption that the v's are all non-zero; the case when they are all 0 is vacuous, but the proof is still valid.
Not quite. You wrote
Since the v's are linearly dependent, there is a set of non-zero c's so that c1v1 + c2v2 + c3v3 = 0
That's not the definition of linear dependence, so your proof rests on an unverified assumption and is therefore invalid.
What's wrong with the definition? http://en.wikipedia.org/wiki/Linear_independence
(Quotation from above)

Definition
A finite subset of n vectors, v1, v2, ..., vn, from the vector space V, is linearly dependent if and only if there exists a set of n scalars, a1, a2, ..., an, not all zero, such that
a1v1 + a2v2 + ... + anvn = 0.
Note that the zero on the right is the zero vector, not the number zero.
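The two phrasings in this thread ("a set of non-zero c's" versus "scalars not all zero") are not equivalent, and a small concrete example makes the gap visible. The vectors below are my own illustration, not from the thread: a dependent set for which every vanishing combination must have a zero coefficient, so no combination with all non-zero coefficients exists.

```python
import numpy as np

# Three vectors in R^2: necessarily linearly dependent (3 vectors, dimension 2).
v1 = np.array([1.0, 0.0])
v2 = np.array([2.0, 0.0])
v3 = np.array([0.0, 1.0])

# Coefficients (2, -1, 0): not all zero, but one of them IS zero.
combo = 2*v1 - 1*v2 + 0*v3
print(np.allclose(combo, 0))  # prints True: the set is dependent

# Yet a*v1 + b*v2 + c*v3 = (a + 2b, c), which is the zero vector
# only if c = 0. So no choice with ALL coefficients non-zero works,
# and the "set of non-zero c's" phrasing would wrongly call this
# set independent.
```

This is exactly why the Wikipedia definition says "not all zero": a proof that assumes every coefficient is non-zero is using a strictly stronger, unverified hypothesis.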