Big-O Cheat Sheet
bigocheatsheet.com
Big-O Algorithm Complexity Cheat Sheet (Know Thy Complexities!) @ericdrowell
Hi there! This webpage covers the space and time Big-O complexities of common algorithms used in Computer Science.
freeCodeCamp
freecodecamp.org โ€บ news โ€บ big-o-cheat-sheet-time-complexity-chart
Big O Cheat Sheet โ€“ Time Complexity Chart
November 7, 2024 - Before we look at examples for ... chart. The Big O chart, also known as the Big O graph, is an asymptotic notation used to express the complexity of an algorithm or its performance as a function of input size....
Discussions

[Graph Theory] what is time complexity?
๐ŸŒ r/learnmath
8
2
May 2, 2021
algorithms - A* graph search time-complexity - Computer Science Stack Exchange
Some confusion about time-complexity and A*. According to A* Wiki the time-complexity is exponential in the depth of the solution (shortest path): The time complexity of A* depends on the heuri... More on cs.stackexchange.com
๐ŸŒ cs.stackexchange.com
April 19, 2016
Time complexity of graphs - Stack Overflow
I am looking at time complexities of graphs. What I don't understand is why the time complexity O(|V|^2|E|) is better than O(|E|^2|V|). In both, one of the factors is squared, so why is one better than the other? More on stackoverflow.com
๐ŸŒ stackoverflow.com
performance - How to find time complexity of connected components graph algorithm - Stack Overflow
This leads to a time complexity of O(E + V log V). The DFS algorithm you presented at the end is again an improvement. If we consider the graph to be undirected, then an edge will be traversed at most two times: once in each direction, and the second time no recursive call will follow, ... More on stackoverflow.com
๐ŸŒ stackoverflow.com
Medium
anshika-bhargava0202.medium.com โ€บ revisiting-graph-algorithms-47b08f307255
Revisiting Graph Algorithms. Let us revise the time complexities ofโ€ฆ | by Anshika Bhargava | Medium
July 17, 2022 - Time complexity: O(V+E), as each vertex is visited once and all the edges are examined. Space complexity: O(V), to keep track of the visited vertices and the queue to which vertices are pushed.
Reddit
reddit.com โ€บ r/learnmath โ€บ [graph theory] what is time complexity?
r/learnmath on Reddit: [Graph Theory] what is time complexity?
May 2, 2021 -

I learned every lecture so far but some of the topics such as BFS have a note such as: Time complexity: uses at most c(n + e) steps (using the adjacency list)

But what tf does it even mean? Where can I start learning what it means?

Top answer
1 of 3
3
Time complexity is a way to express how long an algorithm will run. Typically it is not in absolute terms; it tells you what happens as the size of the input grows. E.g., what happens if I put in twice as large an input? Algorithms that are O(n) will take twice as long; those that are O(n^2) will need four times as much time for the doubled input. The complexity of BFS is O(n+e), where n is the number of vertices and e is the number of edges. In graphs, the number of edges matters for complexity: there are graphs like trees where e is close to n, but there are also complete graphs where e is close to n^2. BFS visits each node only a bounded number of times, so it is pretty efficient. Now if you had some algorithm that started a BFS from every node, its time complexity would be O(n(n+e)).
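A minimal Python sketch (my own illustration, not from the thread) of BFS over an adjacency list, with a step counter to make the c(n + e) bound from the lecture note concrete: each vertex is dequeued exactly once, and each directed edge entry is scanned exactly once.

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first search over an adjacency list.
    Returns (visit order, number of basic steps taken)."""
    visited = {start}
    order = []
    steps = 0                      # counts vertex dequeues + edge scans
    queue = deque([start])
    while queue:
        u = queue.popleft()
        steps += 1                 # each vertex is dequeued exactly once -> n steps
        order.append(u)
        for v in adj[u]:
            steps += 1             # each edge is scanned once per endpoint -> O(e) steps
            if v not in visited:
                visited.add(v)
                queue.append(v)
    return order, steps

# Small path graph 0-1-2-3: n = 4 vertices, e = 3 undirected edges (stored twice).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
order, steps = bfs(adj, 0)
print(order)   # [0, 1, 2, 3]
print(steps)   # 4 vertex dequeues + 6 directed edge scans = 10
```

Doubling both n and e roughly doubles the step count, which is exactly the linear-scaling behavior described above.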
2 of 3
1
Imagine we want to compare two algorithms' performance. Our first thought might be to run them both and see which one finishes first (think drag racing algorithms). However, this is ineffective for two reasons:

1. Algorithms may have better or worse runtime depending on the size of the input, and this performance degradation doesn't always scale linearly.
2. Different computers have different computational power, and therefore perform differently when running the same algorithm.

The next thought might be to count the number of execution steps needed to process an input with a given algorithm, but this still suffers from issue #1, and we can't in general know how many steps an algorithm will take without running it (see: the Halting Problem).

Time complexity is a notation computer scientists use to benchmark algorithms against one another in a way that is agnostic to a computer's individual performance and to the size of the input. Common notation is as follows:

- O() [Big-O notation]: the runtime of the algorithm in the worst-case scenario.
- Ω() [Big-Omega notation]: runtime in the best-case scenario.
- Θ() [Big-Theta notation]: a tight bound; often loosely used for the average case.

Here n usually represents the size of the input. Let's use searching a list for a value as our example. If we iterate over the list front to back looking for the value, the best-case scenario is that the value is at the front of the list, so our big-Omega runtime is Ω(1), otherwise known as constant runtime. However, what we often care more about when analyzing algorithms is how they perform in the worst case, since that tends to be closer to the general case (not always true; see quicksort). Here the worst-case scenario is that our desired value is at the end of the list, meaning our linear search has a big-O runtime of O(n), where n is the length of the list, because we have to iterate over the entire list to get to the end. This is known as linear runtime.

Big-O and big-Omega define upper and lower bounds for the behavior of our algorithm. If we were to plot runtime on a graph, big-O and big-Omega would literally be two curves, big-O on top and big-Omega on the bottom, which the runtime never exceeds. Other common runtimes include quadratic O(n^2), polynomial O(n^x) where x is any positive integer, exponential O(x^n), and factorial O(n!). Exponential and factorial algorithms are often considered intractable: they can't be executed in reasonable time on large datasets.

Here's a good introductory video on time complexity, especially good for visual learners: https://youtu.be/kgBjXUE_Nwc

If you're familiar with data structures and looking for an undergraduate-level introduction to algorithm complexity analysis, Algorithms by Jeff Erickson provides a free, easy-to-read introduction (if a little dense in parts). Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein (commonly referred to as CLRS, or just "the Bible of CS") is what I'd consider more suitable for a graduate-level algorithms course, but it's absolutely understandable for a dedicated undergrad CS student. My undergrad algorithms course supplemented Erickson's book with CLRS. CLRS is a must-read for anyone working in algorithms, academia, or CS research, IMO. I'm personally a fan of Abdul Bari for anything algorithms-related; I haven't watched it personally, but here's his video about time complexity.
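The linear-search example above can be sketched in Python (my own illustration, not from the answer); counting comparisons shows the Ω(1) best case and O(n) worst case directly.

```python
def linear_search(items, target):
    """Scan front to back; return (index, comparisons made),
    or (-1, comparisons) if the target is absent."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))
# Best case: target at the front -> 1 comparison, i.e. Omega(1).
print(linear_search(data, 0))    # (0, 1)
# Worst case: target at the end (or missing) -> n comparisons, i.e. O(n).
print(linear_search(data, 99))   # (99, 100)
```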
Wikipedia
en.wikipedia.org โ€บ wiki โ€บ Time_complexity
Time complexity - Wikipedia
January 18, 2026 - In complexity theory, the unsolved P versus NP problem asks if all problems in NP have polynomial-time algorithms. All the best-known algorithms for NP-complete problems like 3SAT etc. take exponential time. Indeed, it is conjectured for many natural NP-complete problems that they do not have sub-exponential time algorithms. Here "sub-exponential time" is taken to mean the second definition presented below. (On the other hand, many graph problems represented in the natural way by adjacency matrices are solvable in subexponential time simply because the size of the input is the square of the number of vertices.)
Find elsewhere
GeeksforGeeks
geeksforgeeks.org โ€บ dsa โ€บ analysis-algorithms-big-o-analysis
Big O Notation - GeeksforGeeks
Big O notation is used to describe the time or space complexity of algorithms.
Published 1 month ago
W3Schools
w3schools.com โ€บ dsa โ€บ dsa_timecomplexity_theory.php
DSA Time Complexity
The relationship between time and the number of values in the array is linear, and can be displayed in a graph like this: When talking about "operations" here, "one operation" might take one or several CPU cycles, and it really is just a word helping us to abstract, so that we can understand what time complexity is, and so that we can find the time complexity for different algorithms.
GeeksforGeeks
geeksforgeeks.org โ€บ dsa โ€บ why-is-the-complexity-of-both-bfs-and-dfs-ove
Why is the complexity of both BFS and DFS O(V+E)? - GeeksforGeeks
July 23, 2025 - The time complexity of BFS and DFS is O(V+E) because they need to visit and examine every vertex and edge in the graph.
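A sketch (my own Python, not from the GeeksforGeeks article) of why the bound is O(V+E): an iterative DFS visits each vertex once and, on an undirected graph stored as an adjacency list, scans each edge once from each endpoint, for V + 2E basic operations.

```python
def dfs_count(adj, start):
    """Iterative DFS; returns (vertex visits, edge traversals)."""
    visited = set()
    vertex_visits = 0
    edge_traversals = 0
    stack = [start]
    while stack:
        u = stack.pop()
        if u in visited:
            continue               # may be pushed more than once, but visited once
        visited.add(u)
        vertex_visits += 1
        for v in adj[u]:
            edge_traversals += 1   # each undirected edge is seen from both endpoints
            if v not in visited:
                stack.append(v)
    return vertex_visits, edge_traversals

# Triangle graph: V = 3, E = 3 (each edge stored in both directions).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(dfs_count(adj, 0))   # (3, 6): V vertex visits, 2E edge traversals
```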
Top answer
1 of 2
18

These are basically two different perspectives, or two different ways of viewing the running time. Both are valid (neither is incorrect), but one is arguably more useful in the settings that typically arise in AI.

In the algorithms community and the CS theory community, folks tend to count the running time as a function of the number of vertices and edges in the graph. Why? In that context, worst-case running time is what makes the most sense; also, in the problems typically considered in that community, in the worst case we need to examine the entire graph, so you typically can't hope to do better than O(n + e).

In the AI community, folks tend to count the running time differently. They often consider a specific kind of graph: a tree with branching factor b. Also, in the situations that arise there, the graph is often infinite or very large. Typically we try hard to avoid examining all of the graph; that's often one of the major goals of the algorithm. Thus, counting complexity in terms of n and e doesn't make sense: n may be infinite, and in any case, we don't plan on examining all of the graph, so all that matters is the number of vertices we actually visit, not the number that may exist elsewhere but that we don't care about.

So, for the situations that often arise in the AI community, it's often more meaningful to measure the running time in terms of the branching factor of the tree (b) and the depth of the goal node (d). Typically, once we find the goal node, the algorithm stops. In such a tree, if we examine every vertex down to depth d before we find the goal node, we'll end up visiting O(b^d) vertices before we stop. Thus, if you like, you could think of this as visiting a subset of the graph with n = O(b^d) (where n now includes only the vertices we visit) and e = O(b^d) (e includes only the edges we look at), and you could think of an O(n + e)-time algorithm as one whose running time is O(b^d)... though this is a bit of an abuse of the notation. Anyway, hopefully you can see why O(b^d) is more informative than O(n + e) in this context.
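The b^d count can be checked empirically; here is a small Python sketch (my own, under the answer's assumption of a complete tree with branching factor b) that runs BFS over an implicit b-ary tree and counts visited vertices.

```python
from collections import deque

def bfs_to_depth(b, d):
    """BFS an implicit complete b-ary tree, stopping at depth d.
    Returns the number of vertices visited."""
    visited = 0
    queue = deque([0])            # store only depths; the tree itself is implicit
    while queue:
        depth = queue.popleft()
        visited += 1
        if depth < d:
            for _ in range(b):    # each vertex has b children
                queue.append(depth + 1)
    return visited

# Visiting everything down to depth d touches 1 + b + b^2 + ... + b^d vertices,
# which is O(b^d), independent of how large (or infinite) the full graph is.
print(bfs_to_depth(2, 3))   # 1 + 2 + 4 + 8 = 15
print(bfs_to_depth(3, 2))   # 1 + 3 + 9 = 13
```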

2 of 2
2

It is common in the combinatorial search community to define search spaces implicitly, that is, as a set of states and transitions between them, as opposed to explicitly, that is, as concrete sets of vertices and edges. In implicit search spaces, states can be represented as vertices and transitions as edges; however, in many cases the practical set of states has no finite bound, and therefore the number of edges and vertices cannot always be defined with finite cardinalities (either practically or theoretically). Thus, for many applications it makes more sense to define performance in terms of the branching factor b, as opposed to vertex and edge cardinalities.
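A minimal Python sketch (mine, not from the answer) of such an implicitly defined search space: states and transitions come from a successor function rather than pre-built vertex and edge sets, so no |V| or |E| is ever known up front. The integer-state example and the `successors` callback are illustrative assumptions.

```python
from collections import deque

def bfs_implicit(start, successors, is_goal, max_nodes=10**6):
    """BFS over an implicitly defined state space. The graph is never
    materialized; we only expand states via `successors` as we reach them.
    Returns the depth of the nearest goal, or None if none found in budget."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue and len(seen) <= max_nodes:
        state, depth = queue.popleft()
        if is_goal(state):
            return depth
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Toy example: states are integers, transitions are +1 and *2 (branching factor 2).
succ = lambda s: (s + 1, s * 2)
print(bfs_implicit(1, succ, lambda s: s == 10))   # 4: e.g. 1 -> 2 -> 4 -> 5 -> 10
```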

Python Morsels
pythonmorsels.com โ€บ time-complexities
Python Big O: the time complexities of different data structures in Python - Python Morsels
April 16, 2024 - Time complexity is usually discussed in terms of "Big O" notation. This is basically a way to discuss the order of magnitude for a given operation while ignoring the exact number of computations it needs. In "Big O" land, we don't care if something is twice as slow, but we do care whether it's n times slower where n is the length of our list/set/slice/etc. Here's a graph of the common time complexity curves:
Medium
medium.com โ€บ yasser-dev โ€บ understanding-big-o-time-complexity-of-algorithm-in-7-mins-for-beginners-adbe5599d64c
Understanding Big O & Time complexity of Algorithm in 7 mins! โ€” For Beginners | by Yasser Shaikh | yasser.dev | Medium
September 4, 2022 - Big O notation is a metric used to measure how much time an algorithm takes to run (a.k.a. time complexity) or how much memory an algorithm consumes when it runs (a.k.a. space complexity).
Thesciencebrigade
thesciencebrigade.com โ€บ jst โ€บ article โ€บ view โ€บ 452
Time Complexity Analysis of Graph Algorithms in Big Data: Evaluating the Performance of PageRank and Shortest Path Algorithms for Large-Scale Networks | Journal of Science & Technology
August 23, 2024 - PageRank, a foundational algorithm for ranking web pages, operates on the principle of recursively measuring the importance of nodes within a network based on their connectivity. The algorithmโ€™s time complexity is dependent on both the number of nodes and edges in the graph, as well as the convergence criterion used.
WsCube Tech
wscubetech.com โ€บ resources โ€บ dsa โ€บ graph-algorithms
Graph Algorithms in Data Structure (Time Complexity & Techniques)
February 14, 2026 - Understand all graph algorithms in data structures, from basics to advanced techniques, enhancing your understanding of connectivity in this detailed tutorial.
USACO
usaco.guide โ€บ home โ€บ bronze โ€บ time complexity
Time Complexity ยท USACO Guide
We can find the time complexity of multiple loops by multiplying together the time complexities of each loop.
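As a quick illustration of that multiplication rule (my own sketch, not from the USACO guide): two nested loops of size n perform n * n basic operations, i.e. O(n^2), so doubling n quadruples the work.

```python
def count_pairs(n):
    """Two nested loops of size n -> n * n iterations: O(n^2)."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1          # one basic operation per (i, j) pair
    return ops

print(count_pairs(10))    # 100
print(count_pairs(20))    # 400: doubling n quadruples the work
```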
Desmos
desmos.com โ€บ calculator โ€บ cvytuiwjja
Algorithmic Time Complexity | Desmos
Explore math with our beautiful, free online graphing calculator. Graph functions, plot points, visualize algebraic equations, add sliders, animate graphs, and more.