A theory of the fluctuation-induced Nernst effect is developed for a two-dimensional superconductor in a perpendicular magnetic field. First, we derive a simple phenomenological formula for the Nernst coefficient, which naturally explains the giant Nernst signal due to fluctuating Cooper pairs. This signal is shown to remain large even far from the transition and may exceed the Fermi-liquid terms by orders of magnitude. We also present a complete microscopic calculation of the Nernst coefficient for arbitrary magnetic fields and temperatures, based on the Matsubara-Kubo formalism. It is shown that the magnitude and behavior of the Nernst signal observed experimentally in disordered superconducting films can be well understood within the theory of superconducting fluctuations.