Generate all the permutations of a list
There are n! permutations of an n-element list, so any algorithm that outputs all of them cannot do better than O(n!) -- it has to produce n! results.
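A minimal recursive sketch in Python (the function name `all_permutations` is my own; the standard library's `itertools.permutations` does the same job and is what you'd normally use):

```python
from itertools import permutations

def all_permutations(items):
    """Recursively build every ordering of items; n items yield n! results."""
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i, x in enumerate(items):
        rest = items[:i] + items[i + 1:]      # everything except items[i]
        for p in all_permutations(rest):
            result.append([x] + p)
    return result

print(all_permutations([1, 2, 3]))
# Cross-check against the standard library:
assert sorted(all_permutations([1, 2, 3])) == sorted(map(list, permutations([1, 2, 3])))
```

The recursion mirrors the counting argument directly: n choices for the first element, times (n-1)! orderings of the rest.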
If you take multiplication as O(1), then yes, O(N) is correct. However, note that multiplying two numbers of arbitrary length x is not O(1) on finite hardware -- as x tends to infinity, the time needed for multiplication grows (e.g. if you use Karatsuba multiplication, it's O(x ** 1.585)).
You can theoretically do better for sufficiently huge numbers with Schönhage-Strassen, but I confess I have no real-world experience with that one. x, the "length" or "number of digits" (in whatever base; the base doesn't matter for big-O) of N, grows as O(log N), of course.
If you mean to limit your question to factorials of numbers short enough to be multiplied in O(1), then there's no way N can "tend to infinity" and therefore big-O notation is inappropriate.
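The Karatsuba scheme mentioned above can be sketched as follows (a toy base-10 version for nonnegative integers, purely illustrative, not a production implementation):

```python
def karatsuba(x, y):
    """Multiply nonnegative ints with 3 recursive multiplications instead of 4,
    giving O(n ** log2(3)) ~ O(n ** 1.585) digit operations for n-digit inputs."""
    if x < 10 or y < 10:                      # base case: a single-digit factor
        return x * y
    m = max(len(str(x)), len(str(y))) // 2    # split point in decimal digits
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    # One extra multiply recovers the cross terms: (a+b)(c+d) - ac - bd = ad + bc.
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))  # → 7006652
```

The point is only that the cost of one multiplication grows with the operand length, which is exactly why it can't be treated as O(1) as N tends to infinity.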
When you express the complexity of an algorithm, it is always as a function of the input size. It is only valid to assume that multiplication is an O(1) operation if the numbers that you are multiplying are of fixed size. For example, if you wanted to determine the complexity of an algorithm that computes matrix products, you might assume that the individual components of the matrices were of fixed size. Then it would be valid to assume that multiplication of two individual matrix components was O(1), and you would compute the complexity according to the number of entries in each matrix.
However, when you want to figure out the complexity of an algorithm to compute N! you have to assume that N can be arbitrarily large, so it is not valid to assume that multiplication is an O(1) operation.
If you want to multiply an n-bit number with an m-bit number the naive algorithm (the kind you do by hand) takes time O(mn), but there are faster algorithms.
If you want to analyze the complexity of the easy algorithm for computing N!
factorial(N)
    f = 1
    for i = 2 to N
        f = f * i
    return f
then at the k-th step in the for loop, you are multiplying (k-1)! by k. The number of bits used to represent (k-1)! is O(k log k) and the number of bits used to represent k is O(log k). So the time required to multiply (k-1)! and k is O(k (log k)^2) (assuming you use the naive multiplication algorithm). Then the total amount of time taken by the algorithm is the sum of the time taken at each step:
sum_{k=1}^{N} [k (log k)^2] <= (log N)^2 * sum_{k=1}^{N} [k] = O(N^2 (log N)^2)
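The O(k log k) bit count used above is easy to check empirically with Python's arbitrary-precision integers (a rough sanity check, not part of the proof):

```python
import math

# Compare the exact number of bits in k! against the k * log2(k) leading-order
# estimate (by Stirling, log2(k!) = k log2 k - O(k), so the estimate is an
# upper bound of the right order).
for k in (10, 100, 1000):
    actual = math.factorial(k).bit_length()
    estimate = k * math.log2(k)
    print(k, actual, round(estimate))
```

The printed values show `actual` tracking the estimate to within the expected constant factor as k grows.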
You could improve this performance by using a faster multiplication algorithm, like Schönhage-Strassen, which takes time O(n log(n) log(log(n))) to multiply two n-bit numbers.
The other way to improve performance is to use a better algorithm to compute N!. The fastest one that I know of first computes the prime factorization of N! and then multiplies all the prime factors.
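That prime-factorization approach can be sketched with Legendre's formula (a simplified illustration under my own naming; the genuinely fast factorial algorithms also combine the factors with balanced product trees, which this sketch omits):

```python
import math

def factorial_by_primes(n):
    """Compute n! from its prime factorization. By Legendre's formula, the
    exponent of a prime p in n! is sum over i >= 1 of floor(n / p**i)."""
    if n < 2:
        return 1
    # Sieve of Eratosthenes for the primes up to n.
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    result = 1
    for p in range(2, n + 1):
        if sieve[p]:
            e, q = 0, p
            while q <= n:
                e += n // q       # multiples of p, p^2, p^3, ... up to n
                q *= p
            result *= p ** e
    return result

assert factorial_by_primes(10) == math.factorial(10) == 3628800
```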
All possible paths in a graph: if all nodes are connected, you have 100 nodes, and the start node is given, you can move to any of 99 nodes. If you don't want to visit the same node twice, each of those leaves 98 choices for the second node, for 99 x 98 paths in total; each path to the second node then allows 97 choices for the third node, that's 99 x 98 x 97, and so on. Quite obviously factorial in the number of nodes.
The most important question is: How does the work required change if the problem size changes? If you are lucky, then you can restrict the growth, but if increasing the problem size by 1 increases the time by a constant factor c > 1 or more, then you have exponential growth.
In practice, just implement the algorithm and run it with growing problem size. Construct a "travelling salesman" problem with varying number of cities from 1 to 10000, set it up with random distances, and solve it. How fast does the execution time grow with the number of cities? What's the largest problem you can solve in a day?
I'm particularly confused when trying to calculate the time complexity of algorithms that explore all possible paths in a graph.
Consider a graph $G$ of $n$ vertices. Assuming all pairs of vertices are connected via an edge (i.e. $G$ is complete), then a path $v_1v_2\dots v_n$ can be any permutation of the $n$ vertices. Hence, there exist $n!$ possible paths in $G$ since we have $n$ choices to pick the first vertex, $n-1$ choices for the second, and so on.
An algorithm often exhibits a factorial time complexity when it iterates through all permutations of a certain set of elements (such as vertices in a graph). An algorithm often runs in $\mathcal{O}(2^n)$ when it iterates through all possible subsets of a set of elements (such as a solver to SUBSET-SUM).
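A SUBSET-SUM solver of exactly this $\mathcal{O}(2^n)$ shape can be sketched with bitmask enumeration (illustrative code under my own naming, to contrast with the $n!$ permutation case above):

```python
def subset_sum(values, target):
    """Check every one of the 2**n subsets of values via bitmask enumeration;
    return the first subset that sums to target, or None."""
    n = len(values)
    for mask in range(1 << n):                # 2**n iterations
        subset = [v for i, v in enumerate(values) if mask >> i & 1]
        if sum(subset) == target:
            return subset
    return None

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # → [4, 5]
```

Each added element doubles the number of subsets to check, whereas each added vertex multiplies the number of permutations by n; both are intractable, but $n!$ grows strictly faster than $2^n$.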