Tensors may be considered a generalization of vectors and covectors. They are fundamental quantities in the study of differential geometry and physics.
Let M^n be an n-dimensional differentiable manifold. For each x\in M^n, let E=E_x=T_xM^n, i.e. the tangent space to M^n at x. We denote the canonical basis of E by \partial=\left(\frac{\partial}{\partial x^1},\cdots,\frac{\partial}{\partial x^n}\right) and its dual basis by \sigma=dx=(dx^1,\cdots,dx^n), where x^1,\cdots,x^n are local coordinates. The canonical basis vectors \frac{\partial}{\partial x^1},\cdots,\frac{\partial}{\partial x^n} are also simply denoted by \partial_1,\cdots,\partial_n.
Covariant Tensors
Definition. A covariant tensor of rank r is a multilinear real-valued function
Q:E\times E\times\cdots\times E\longrightarrow\mathbb{R}
of r-tuples of vectors. A covariant tensor of rank r is also called a tensor of type (0,r) or, briefly, a (0,r)-tensor. Note that the values of Q must be independent of the basis in which the components of the vectors are expressed. A covariant vector (also called a covector or a 1-form) is a covariant tensor of rank 1. An important example of a covariant tensor of rank 2 is the metric tensor G:
G(v,w)=\langle v,w\rangle=\sum_{i,j}g_{ij}v^iw^j.
In components, by multilinearity,
\begin{align*}
Q(v_1,\cdots,v_r)&=Q\left(\sum_{i_1}v_1^{i_1}\partial_{i_1},\cdots,\sum_{i_r}v_r^{i_r}\partial_{i_r}\right)\\
&=\sum_{i_1,\cdots,i_r}v_1^{i_1}\cdots v_r^{i_r}Q(\partial_{i_1},\cdots,\partial_{i_r}).
\end{align*}
Denote Q(\partial_{i_1},\cdots,\partial_{i_r}) by Q_{i_1,\cdots,i_r}. Then
Q(v_1,\cdots,v_r)=\sum_{i_1,\cdots,i_r}Q_{i_1,\cdots,i_r}v_1^{i_1}\cdots v_r^{i_r}.\ \ \ \ \ \mbox{(1)}
Using Einstein's summation convention, (1) can be written more compactly as
Q(v_1,\cdots,v_r)=Q_{i_1,\cdots,i_r}v_1^{i_1}\cdots v_r^{i_r}.
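Formula (1) can be checked numerically. Here is a minimal NumPy sketch for a rank-2 covariant tensor; the components of Q and of the vectors are made-up sample data, and np.einsum carries out the summation over the repeated indices:

```python
import numpy as np

# A sample rank-2 covariant tensor on R^3: a symmetric matrix
# of components Q_ij (made-up data).
Q = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

v = np.array([1.0, 2.0, 3.0])   # components v^i
w = np.array([0.0, 1.0, -1.0])  # components w^j

# Q(v, w) = Q_ij v^i w^j, summing over the repeated indices i, j
value = np.einsum('ij,i,j->', Q, v, w)
print(value)  # 4.0
```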
The set of all covariant tensors of rank r forms a vector space over \mathbb{R}. The number of components of such a tensor is n^r. The vector space of all rank-r covariant tensors is denoted by
E^\ast\otimes E^\ast\otimes\cdots\otimes E^\ast=\otimes^r E^\ast.
If \alpha,\beta\in E^\ast, i.e. covectors, we can form a 2nd rank covariant tensor, the tensor product \alpha\otimes\beta of \alpha and \beta: Define \alpha\otimes\beta: E\times E\longrightarrow\mathbb{R} by
\alpha\otimes\beta(v,w)=\alpha(v)\beta(w).
If we write \alpha=a_idx^i and \beta=b_jdx^j, then
(\alpha\otimes\beta)_{ij}=\alpha\otimes\beta(\partial_i,\partial_j)=\alpha(\partial_i)\beta(\partial_j)=a_ib_j.
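In matrix terms, the components a_ib_j form the outer product of the two component arrays. A small NumPy sketch with made-up covector components:

```python
import numpy as np

# Covector components a_i and b_j (made-up sample data)
a = np.array([1.0, 0.0, 2.0])
b = np.array([3.0, -1.0, 1.0])

# Components of the tensor product: (alpha ⊗ beta)_ij = a_i b_j
T = np.outer(a, b)

# Evaluating alpha⊗beta on vectors v, w gives alpha(v) * beta(w)
v = np.array([1.0, 1.0, 0.0])
w = np.array([0.0, 2.0, 1.0])
lhs = np.einsum('ij,i,j->', T, v, w)  # (alpha⊗beta)(v, w)
rhs = (a @ v) * (b @ w)               # alpha(v) beta(w)
print(np.isclose(lhs, rhs))  # True
```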
Contravariant Tensors
A contravariant vector, i.e. an element of E can be considered as a linear functional v: E^\ast\longrightarrow\mathbb{R} defined by
v(\alpha)=\alpha(v)=a_iv^i,\ \alpha=a_idx^i\in E^\ast.
Definition. A contravariant tensor of rank s is a multilinear real-valued function T on s-tuples of covectors
T:E^\ast\times E^\ast\times\cdots\times E^\ast\longrightarrow\mathbb{R}. A contravariant tensor of rank s is also called a tensor of type (s,0) or, briefly, an (s,0)-tensor.
For 1-forms \alpha_k=a_{k_i}dx^i\ (k=1,\cdots,s), multilinearity gives
T(\alpha_1,\cdots,\alpha_s)=a_{1_{i_1}}\cdots a_{s_{i_s}}T^{i_1\cdots i_s}
where
T^{i_1\cdots i_s}:=T(dx^{i_1},\cdots,dx^{i_s}).
The space of all contravariant tensors of rank s is denoted by
E\otimes E\otimes\cdots\otimes E:=\otimes^s E.
Contravariant vectors are contravariant tensors of rank 1. An example of a contravariant tensor of rank 2 is the inverse of the metric tensor G^{-1}=(g^{ij}):
G^{-1}(\alpha,\beta)=g^{ij}a_ib_j.
Given a pair v,w of contravariant vectors, we can form the tensor product v\otimes w in the same manner as we did for covectors. It is the 2nd rank contravariant tensor with components (v\otimes w)^{ij}=v^iw^j. The metric tensor G and its inverse G^{-1} may be written as
G=g_{ij}dx^i\otimes dx^j\ \mbox{and}\ G^{-1}=g^{ij}\partial_i\otimes\partial_j.
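Numerically, the components g^{ij} of the inverse metric are just the entries of the inverse matrix of (g_{ij}), so g^{ij}g_{jk}=\delta^i_k. A short NumPy sketch with a made-up metric:

```python
import numpy as np

# A sample metric tensor g_ij: symmetric positive-definite (made-up data)
g = np.array([[2.0, 1.0],
              [1.0, 2.0]])

g_inv = np.linalg.inv(g)  # components g^{ij} of the inverse metric

# g^{ij} g_{jk} = delta^i_k
print(np.allclose(g_inv @ g, np.eye(2)))  # True

# G^{-1}(alpha, beta) = g^{ij} a_i b_j for covectors alpha, beta
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
val = np.einsum('ij,i,j->', g_inv, a, b)
print(val)
```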
Mixed Tensors
Definition. A mixed tensor, r times covariant and s times contravariant, is a real-valued multilinear function W
W: E^\ast\times E^\ast\times\cdots\times E^\ast\times E\times E\times\cdots\times E\longrightarrow\mathbb{R}
on s-tuples of covectors and r-tuples of vectors. It is also called a tensor of type (s,r) or, briefly, an (s,r)-tensor. By multilinearity,
W(\alpha_1,\cdots,\alpha_s, v_1,\cdots, v_r)=a_{1_{i_1}}\cdots a_{s_{i_s}}W^{i_1\cdots i_s}{}_{j_1\cdots j_r}v_1^{j_1}\cdots v_r^{j_r}
where
W^{i_1\cdots i_s}{}_{j_1\cdots j_r}:=W(dx^{i_1},\cdots,dx^{i_s},\partial_{j_1},\cdots,\partial_{j_r}).
A 2nd rank mixed tensor may arise from a linear operator A: E\longrightarrow E. Define W_A: E^\ast\times E\longrightarrow\mathbb{R} by W_A(\alpha,v)=\alpha(Av). Let A=(A^i{}_j) be the matrix associated with A, i.e. A(\partial_j)=\partial_i A^i{}_j. Let us calculate the components of W_A:
W_A^i{}_j=W_A(dx^i,\partial_j)=dx^i(A(\partial_j))=dx^i(\partial_kA^k{}_j)=\delta^i_kA^k{}_j=A^i{}_j.
So the matrix of the mixed tensor W_A is just the matrix associated with A. Conversely, given a mixed tensor W, once covariant and once contravariant, we can define a linear transformation A such that W(\alpha,v)=\alpha(Av). We do not distinguish between a linear transformation A and its associated mixed tensor W_A. In components, W(\alpha,v) is written as
W(\alpha,v)=a_iA^i{}_jv^j=aAv.
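The identity W_A(\alpha,v)=a_iA^i{}_jv^j=aAv is easy to check numerically; a minimal NumPy sketch with made-up components:

```python
import numpy as np

# Matrix A^i_j of a linear transformation A: E -> E (made-up data)
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

a = np.array([1.0, 1.0])  # covector components a_i
v = np.array([2.0, 1.0])  # vector components v^j

# W_A(alpha, v) = alpha(A v), i.e. the row vector a times the matrix A times v
lhs = a @ (A @ v)
# The same number as the full contraction a_i A^i_j v^j
rhs = np.einsum('i,ij,j->', a, A, v)
print(np.isclose(lhs, rhs))  # True
```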
The tensor product w\otimes\beta of a vector and a covector is the mixed tensor defined by
(w\otimes\beta)(\alpha,v)=\alpha(w)\beta(v). The associated linear transformation can be written as
A=A^i{}_j\partial_i\otimes dx^j=\partial_i\otimes A^i{}_jdx^j.
For math undergraduates, the different ways of writing indices (raised, lowered, and mixed) in tensor notation can be very confusing. The main reason is that in standard math courses such as linear algebra or elementary differential geometry (the classical differential geometry of curves and surfaces in \mathbb{E}^3) the matrix of a linear transformation is usually written as A_{ij}. Physics undergraduates don't usually get a chance to learn tensors in undergraduate physics courses. In order to study more advanced differential geometry or physics, such as the theory of special and general relativity and field theory, one must be able to distinguish three different ways of writing matrices: A_{ij}, A^{ij}, and A^i{}_j. To summarize, A_{ij} and A^{ij} are bilinear forms on E and E^\ast, respectively, defined by
A_{ij}v^iw^j\ \mbox{and}\ A^{ij}a_ib_j\ (\mbox{respectively}), while A^i{}_j is the matrix of a linear transformation A: E\longrightarrow E.
Let (E,\langle\ ,\ \rangle) be an inner product space. Given a linear transformation A: E\longrightarrow E (i.e. a mixed tensor), one can associate to it a covariant bilinear form A' by
A'(v,w):=\langle v,Aw\rangle=v^ig_{ij}A^j{}_k w^k. So we see that the matrix of A' is
A'_{ik}=g_{ij}A^j{}_k. This process is described as "we lower the index j, making it a k, by means of the metric tensor g_{ij}." In tensor analysis one uses the same letter, i.e. instead of A', one writes
A_{ik}:=g_{ij}A^j{}_k. This is clearly a covariant tensor. In general, the components of the associated covariant tensor A_{ik} differ from those of the mixed tensor A^i{}_j. But if the basis is orthonormal, i.e. g_{ij}=\delta_{ij}, then they coincide. That is the reason why we simply write A_{ij} without making any distinction in linear algebra or in elementary differential geometry.
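Lowering the index amounts to multiplying the matrix of A by the matrix of the metric. A NumPy sketch with made-up components, which also checks that with an orthonormal basis (g the identity) the components are unchanged:

```python
import numpy as np

g = np.array([[2.0, 1.0],   # metric components g_ij (made-up data)
              [1.0, 3.0]])
A = np.array([[1.0, 0.0],   # mixed tensor components A^i_j (made-up data)
              [2.0, 1.0]])

# Lower the index: A_ik = g_ij A^j_k
A_lower = np.einsum('ij,jk->ik', g, A)
print(A_lower)

# With an orthonormal basis, g_ij = delta_ij and the components coincide
A_lower_orthonormal = np.einsum('ij,jk->ik', np.eye(2), A)
print(np.allclose(A_lower_orthonormal, A))  # True
```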
Similarly, one may associate to the linear transformation A a contravariant bilinear form
\bar A(\alpha,\beta)=a_iA^i{}_jg^{jk}b_k whose matrix components can be written as
A^{ik}=A^i{}_jg^{jk}.
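Raising an index is the analogous multiplication by the inverse metric; a NumPy sketch with made-up components:

```python
import numpy as np

g = np.array([[2.0, 1.0],   # metric components g_ij (made-up data)
              [1.0, 3.0]])
g_inv = np.linalg.inv(g)    # inverse metric components g^{jk}
A = np.array([[1.0, 0.0],   # mixed tensor components A^i_j (made-up data)
              [2.0, 1.0]])

# Raise the index: A^{ik} = A^i_j g^{jk}
A_upper = np.einsum('ij,jk->ik', A, g_inv)
print(A_upper)
```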
Note that the metric tensor g_{ij} represents a linear map from E to E^\ast, sending the vector with components v^j into the covector with components g_{ij}v^j. In quantum mechanics, the covector g_{ij}v^j is denoted by \langle v| and called a bra vector, while the vector v^j is denoted by |v\rangle and called a ket vector. Usually the inner product on E
\langle\ ,\ \rangle:E\times E\longrightarrow\mathbb{R};\ \langle v,w\rangle=g_{ij}v^iw^j is considered as a covariant tensor of rank 2. But in quantum mechanics \langle v,w\rangle is not considered as a covariant tensor g_{ij} of rank 2 acting on a pair of vectors (v,w), rather it is regarded as the braket \langle v|w\rangle, a bra vector \langle v| acting on a ket vector |w\rangle.
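The two viewpoints give the same number: the bra with components g_{ij}v^j acting on the ket w^i agrees with the (0,2)-tensor g_{ij} acting on the pair (v,w). A small NumPy check with made-up data:

```python
import numpy as np

g = np.array([[1.0, 0.0],   # metric components g_ij (made-up data)
              [0.0, 2.0]])
v = np.array([1.0, 3.0])    # ket |v>, components v^i
w = np.array([2.0, 1.0])    # ket |w>, components w^i

bra_v = g @ v               # covector g_ij v^j, the bra <v|

# <v|w> as a bra acting on a ket, versus g as a (0,2)-tensor on (v, w)
print(np.isclose(bra_v @ w, np.einsum('ij,i,j->', g, v, w)))  # True
```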