Linear regression in the presence of severe uncertainty in the measurements, the model structure, and the model's permanence over time is a challenging problem. Standard regression techniques optimize a performance criterion, usually the mean squared error, and are highly sensitive to such uncertainties. Regularization methods have been developed to address measurement uncertainty, but choosing the regularization parameter under severe uncertainty is itself problematic. Here we develop an alternative regression methodology that satisfices, rather than optimizes, the performance criterion while maximizing robustness to uncertainty. Uncertainties are represented by info-gap models, which entail an unbounded family of nested sets of measurements parameterized by a non-probabilistic horizon of uncertainty. We prove and demonstrate that the robust-satisficing solution differs from the optimal least-squares solution and that the info-gap approach can provide greater robustness to uncertainty.
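As a minimal numerical sketch of the robust-satisficing idea (an illustration, not the paper's own formulation), the example below assumes an info-gap model on the design matrix, a form of model-structure uncertainty: the true regressor matrix may deviate from the nominal one by up to a horizon h in spectral norm. The robustness of an estimate q is then the largest h for which the worst-case residual still satisfies a critical level r_c, and scanning shrunken versions of the least-squares estimate shows the most robust satisficer differs from the least-squares optimum. All names and the specific uncertainty model here are illustrative assumptions.

```python
import numpy as np

def info_gap_robustness(q, X, y, r_c):
    """Largest horizon h such that the worst-case residual norm stays at
    or below the satisficing level r_c, under an illustrative info-gap
    model in which the true design matrix may deviate from the nominal
    X by at most h in spectral norm.  The worst-case residual norm is
    ||y - X q|| + h * ||q|| (tight for a suitable rank-one deviation)."""
    nominal = np.linalg.norm(y - X @ q)
    qnorm = np.linalg.norm(q)
    if qnorm == 0.0:
        return np.inf if nominal <= r_c else 0.0
    return max(r_c - nominal, 0.0) / qnorm

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                       # nominal design matrix
y = X @ np.array([2.0, -1.0, 0.5]) + 0.3 * rng.normal(size=50)

# Least-squares estimate, and a satisficing level set above the optimum:
# we accept any residual up to 1.5x the best achievable one.
q_ols = np.linalg.lstsq(X, y, rcond=None)[0]
r_c = 1.5 * np.linalg.norm(y - X @ q_ols)
h_ols = info_gap_robustness(q_ols, X, y, r_c)

# Scan shrunken versions of the least-squares estimate.  Shrinking
# trades a second-order increase in the nominal residual for a
# first-order decrease in ||q||, so the most robust satisficer
# shrinks toward zero, i.e. it differs from the least-squares solution.
ts = np.linspace(0.5, 1.0, 501)
hs = np.array([info_gap_robustness(t * q_ols, X, y, r_c) for t in ts])
t_best = float(ts[int(np.argmax(hs))])
h_best = float(hs.max())
```

With the satisficing level strictly above the least-squares residual, the scan finds `t_best < 1` and `h_best > h_ols`: giving up optimal nominal performance buys extra robustness to uncertainty, which is the trade-off the abstract describes.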