While it may be reasonable to design algorithms that first perturb their inputs, and the mathematics used in smoothed analysis may help analyze such algorithms, this is not the goal of smoothed analysis.
Smoothed analysis is offered as an alternative to worst-case and average-case analyses: it measures the complexity of an algorithm under the assumption that its inputs are subject to slight random perturbations in their formation.
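Concretely, in the Gaussian model (one common choice of perturbation; this is a sketch, as other models appear in the literature), the smoothed complexity of an algorithm with running time \(T\) on inputs of size \(n\) is

\[
\mathrm{Smoothed}_{\sigma}(n) \;=\; \max_{\|\bar{x}\| \le 1} \;\; \mathbb{E}_{g \sim N(0,\,\sigma^{2} I)}\big[\, T(\bar{x} + g)\,\big].
\]

As \(\sigma \to 0\) this recovers worst-case complexity, while as \(\sigma\) grows the perturbation swamps the adversarial choice of \(\bar{x}\) and the measure approaches an average-case one.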
For a detailed explanation of why, see the second-to-last paragraph of page 9 of Smoothed Analysis of Algorithms: Why The Simplex Algorithm Usually Takes Polynomial Time.
The short answer is that the probability distributions considered in smoothed analysis have exponentially small overlap in probability mass with the distributions considered in average-case analysis. So the smoothed complexity of an algorithm can be exponential even when its average-case complexity is polynomial and has polynomial moments. In fact, a stronger statement is true: the exponential smoothed behavior can have a negligible effect on all bounded moments of the average-case complexity.
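To see how this can happen, here is an artificial example of my own (an illustration only, not a result from the paper). Suppose an algorithm's running time on inputs in the unit ball of \(\mathbb{R}^{n}\) is

\[
T(x) \;=\; \begin{cases} 2^{n} & \text{if } \|x - \bar{x}\| \le 1/n,\\ n^{2} & \text{otherwise,} \end{cases}
\]

for some fixed bad instance \(\bar{x}\). Under the uniform distribution the bad ball has measure at most \(n^{-n}\), so for every fixed \(k\) we have \(\mathbb{E}[T^{k}] \le n^{2k} + 2^{kn} n^{-n} = n^{2k} + o(1)\): every bounded moment of the average-case complexity is polynomial. Yet a Gaussian perturbation of \(\bar{x}\) with standard deviation \(\sigma = 1/(2n^{3/2})\) has norm concentrated near \(\sigma\sqrt{n} = 1/(2n)\), so the perturbed input stays inside the bad ball with high probability, and the smoothed complexity at this \(\sigma\) is \(2^{\Omega(n)}\). The perturbed distribution concentrates its mass precisely where the uniform distribution places only \(n^{-n}\) of its mass.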
Moreover, the techniques used in smoothed analysis are often quite different from those used in average-case analysis. The reason is that many average-case analyses exploit symmetry properties of their distributions that do not hold in smoothed analyses: for example, a rotationally symmetric input distribution loses that symmetry once the perturbation is centered at an arbitrary instance.
For a cautionary tale about discrete perturbations, check out the paper Smoothed Analysis of Three Combinatorial Problems (MFCS '03) by Cyril Banderier, Rene Beier, and Kurt Mehlhorn, and my commentary on this result.
For some elementary results in discrete smoothed analysis, see the paper Smoothed Analysis: Motivation and Discrete Models.