A process is developed to assess the effect of fusing polarimetric and spectral sensing modalities in an urban target detection scenario through simulation with the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. Two novel multimodal fusion algorithms are proposed: one operating at the pixel level and another at the decision level. A synthetic urban scene is validated to ensure it contains sufficient background clutter. The signal-to-clutter ratios (SCRs) of the simulated spectral and polarimetric data are calculated, and the synthetic spectral SCR is compared to data collected with the Compact Airborne Spectral Sensor (COMPASS). A qualitative examination of the polarimetric background clutter level is also described. The performance of the fusion algorithms is evaluated at 355 different sun-target-sensor viewing geometries, and a method to quantify the improvement in performance is described. Tasking conditions under which target detection performance is enhanced are identified, and the decision fusion algorithm is shown to outperform the pixel fusion algorithm. The utility of polarimetric information varies with the sun-target-sensor geometry, but data fusion consistently enhances spectral target detection performance when the sensor is located within the sun's specular reflection lobe.