Summary: Many problems occurring in practice are NP-hard. Two approaches have proven successful in dealing with NP-hard problems. One is parameterized complexity theory, in which the structural hardness of a problem is expressed through an additional parameter. The other follows a probabilistic way of analyzing problems and algorithms: the average-case performance of algorithms for a hard problem is investigated on instances generated randomly according to various probability distributions. Although the concept of randomness has been applied in the design of parameterized algorithms, the average-case analysis of parameterized problems and algorithms seems to have been neglected. We first aim to analyze the average-case behavior of some known parameterized algorithms, mainly motivated by recent encouraging experimental studies of these algorithms. Compared to the worst-case analysis, most of these studies record a much more promising performance on real-world data as well as on some random data sets. Moreover, we will consider some classic parameterized problems such as Clique, which are unlikely to admit parameterized algorithms according to worst-case complexity analysis, under certain practically relevant distributions. The goal here is to show that some of these worst-case intractable problems are efficiently solvable in the average case.
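To illustrate what it means for hardness to be confined to an additional parameter, the following sketch shows a classic fixed-parameter tractable algorithm (not one of the algorithms studied here): the bounded-search-tree decision procedure for k-Vertex Cover, whose running time is exponential only in the parameter k, not in the input size.

```python
def has_vertex_cover(edges, k):
    """Decide whether the graph given by `edges` has a vertex cover of size <= k.

    Bounded search tree: for any uncovered edge (u, v), at least one of u, v
    must be in the cover, so we branch on both choices. The recursion depth is
    at most k, giving O(2^k * |E|) time -- exponential only in the parameter k.
    """
    if not edges:          # every edge is covered
        return True
    if k == 0:             # no budget left, but an edge remains uncovered
        return False
    u, v = edges[0]        # pick any uncovered edge to branch on
    # Branch 1: put u in the cover and discard all edges incident to u.
    remaining_u = [e for e in edges if u not in e]
    # Branch 2: put v in the cover and discard all edges incident to v.
    remaining_v = [e for e in edges if v not in e]
    return has_vertex_cover(remaining_u, k - 1) or has_vertex_cover(remaining_v, k - 1)


# A triangle needs two vertices to cover all three edges.
triangle = [(1, 2), (2, 3), (1, 3)]
print(has_vertex_cover(triangle, 2))  # True
print(has_vertex_cover(triangle, 1))  # False
```

The worst-case bound O(2^k · |E|) is exactly the kind of guarantee whose average-case counterpart, under random input distributions, this project sets out to study.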