A low-rank and sparse matrix decomposition (LRaSMD) detector has been proposed to detect anomalies in hyperspectral imagery (HSI). The detector assumes that the background is low-rank, while anomalies are gross errors that are sparsely distributed throughout the image scene. By solving a constrained convex optimization problem, the LRaSMD detector separates the anomalies from the background, thereby protecting the background model from corruption by anomalies. An anomaly value for each pixel is calculated using the Euclidean distance, and anomalies are identified by thresholding this value. Four groups of experiments on three widely used HSI datasets are designed to comprehensively analyze the performance of the new detector. Experimental results show that the LRaSMD detector outperforms the global Reed-Xiaoli (GRX), the orthogonal subspace projection-GRX, and the cluster-based detectors. Moreover, the results show that LRaSMD achieves detection performance equal to or better than that of the local support vector data description detector at a lower computational cost.
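The decomposition-then-threshold pipeline described above can be sketched in a minimal form. The example below is an illustrative sketch only, not the paper's implementation: it solves a standard robust-PCA-style convex program (principal component pursuit) with an inexact-ALM/ADMM loop on synthetic data, scores each pixel by the Euclidean norm of its row in the sparse component, and thresholds the scores. All names (`rpca_admm`, the regularization weight `lam`, the mean-plus-three-sigma threshold) are assumptions chosen for the sketch.

```python
import numpy as np

def rpca_admm(X, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Sketch: split X into low-rank L (background) + sparse S (anomalies)
    by principal component pursuit, solved with an inexact ALM/ADMM loop.
    This is a generic solver, not the paper's exact algorithm."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))        # common default weight
    if mu is None:
        mu = 0.25 * m * n / (np.abs(X).sum() + 1e-12)
    norm_X = np.linalg.norm(X)
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)                       # dual variable
    for _ in range(max_iter):
        # Low-rank update: singular-value thresholding
        U, sig, Vt = np.linalg.svd(X - S + Y / mu, full_matrices=False)
        sig = np.maximum(sig - 1.0 / mu, 0.0)
        L = (U * sig) @ Vt
        # Sparse update: elementwise soft thresholding (shrinkage)
        T = X - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual ascent on the residual
        R = X - L - S
        Y = Y + mu * R
        if np.linalg.norm(R) / (norm_X + 1e-12) < tol:
            break
    return L, S

# Synthetic stand-in for an HSI scene: pixels x bands, low-rank background
# plus a few gross anomaly pixels (hypothetical data, for illustration).
rng = np.random.default_rng(0)
n_pixels, n_bands, rank = 400, 50, 3
background = rng.normal(size=(n_pixels, rank)) @ rng.normal(size=(rank, n_bands))
X = background.copy()
anomaly_idx = [10, 100, 250]
X[anomaly_idx] += rng.normal(scale=8.0, size=(len(anomaly_idx), n_bands))

L, S = rpca_admm(X)
# Anomaly value per pixel: Euclidean norm of its sparse-component row
score = np.linalg.norm(S, axis=1)
threshold = score.mean() + 3.0 * score.std()   # illustrative threshold rule
detected = np.where(score > threshold)[0]
```

The key design point the abstract emphasizes is visible here: because anomalies are absorbed into `S` during the optimization, the low-rank background estimate `L` is not corrupted by them, unlike background models fit directly to the raw data.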