Riemannian optimization with a nonsmooth objective function has recently drawn considerable attention due to its wide range of applications in many different fields. However, algorithms for solving this important class of optimization problems remain relatively limited, as most existing algorithms for Riemannian optimization require the objective function to be smooth. In this talk, we propose a series of algorithms for solving nonsmooth optimization problems over the Stiefel manifold. The proposed algorithms handle two important types of nonsmooth objective functions: a smooth loss function plus a nonsmooth regularizer, and the composition of a nonsmooth function with a smooth mapping. We also discuss algorithms that access only a subset of the data at each iteration, which are suitable for machine learning problems with large-scale training data sets. Global convergence, iteration complexity, and local convergence of the proposed algorithms are established. Numerical results on sparse PCA, dictionary learning, and robust subspace recovery are reported to demonstrate the efficiency and advantages of the proposed algorithms.
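For concreteness, the two problem classes mentioned above can be written as follows; the specific symbols (f, r, h, c) are illustrative notation, not necessarily the authors' own, with St(n, p) denoting the Stiefel manifold:

```latex
% Stiefel manifold of n-by-p matrices with orthonormal columns
\mathrm{St}(n,p) = \{ X \in \mathbb{R}^{n \times p} : X^\top X = I_p \}

% Type 1: smooth loss plus nonsmooth regularizer
% (e.g., sparse PCA with f a smooth variance term and r(X) = \mu \|X\|_1)
\min_{X \in \mathrm{St}(n,p)} \; f(X) + r(X), \quad f \text{ smooth}, \; r \text{ nonsmooth}

% Type 2: composition of a nonsmooth function with a smooth mapping
\min_{X \in \mathrm{St}(n,p)} \; h(c(X)), \quad h \text{ nonsmooth}, \; c \text{ smooth}
```

The sparse PCA instance shown as an example of Type 1 is a standard illustration in this literature; the talk's numerical experiments on sparse PCA, dictionary learning, and robust subspace recovery fall into these two classes.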