Vision-based human identification at a distance has recently gained increasing attention in surveillance applications. Gait has the advantages of being non-invasive and difficult to conceal, and is the only biometric perceivable at a distance. This paper introduces a novel feature representation method for gait analysis and recognition. The method comprises the following steps. First, silhouette extraction is performed on each image sequence. Second, the distributions of points sampled from the local human silhouette are analyzed, and the gait cycle is detected by a histogram-based approach. Third, a contextual-stance appearance model is built by arranging all frames of one gait cycle around a ring in a two-dimensional plane, frame by frame at fixed angular intervals. The resulting gait appearance model captures, in the polar plane, both the structural information of the individual silhouette and that of the contextual silhouettes centered at the current frame. Using a designed invariant histogram-based descriptor, the gait appearance characteristics are encoded as a sequence of shape distributions. These distributions are finally matched for gait recognition using a Jeffrey-divergence criterion combined with dynamic time warping. Recognition capability is demonstrated by a correct classification rate (CCR) of 87.59% on the Soton database, and the results show that our approach outperforms existing methods.
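To make the matching stage concrete, the sketch below shows one plausible reading of the final step: comparing two sequences of shape-distribution histograms with the Jeffrey divergence as the frame-to-frame cost inside a standard dynamic time warping recursion. This is a minimal illustration in Python with NumPy; the function names and the exact divergence normalization are our assumptions, not details taken from the paper.

```python
import numpy as np

def jeffrey_divergence(p, q, eps=1e-12):
    """Symmetric Jeffrey divergence between two normalized histograms.

    Computed as KL(p || m) + KL(q || m) with m = (p + q) / 2; the exact
    normalization used in the paper is an assumption here.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    m = (p + q) / 2.0
    return float(np.sum(p * np.log(p / m) + q * np.log(q / m)))

def dtw_distance(seq_a, seq_b, dist=jeffrey_divergence):
    """Dynamic time warping between two sequences of histograms.

    D[i, j] accumulates the cheapest alignment cost of the first i frames
    of seq_a against the first j frames of seq_b.
    """
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(seq_a[i - 1], seq_b[j - 1])
            # Allow match, insertion, or deletion, as in standard DTW.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

In a recognition setting, each gallery subject would be represented by its sequence of shape distributions over one gait cycle, and a probe sequence would be assigned to the subject minimizing this DTW distance.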