Medium
anshika-bhargava0202.medium.com › revisiting-graph-algorithms-47b08f307255
Revisiting Graph Algorithms. Let us revise the time complexities of… | by Anshika Bhargava | Medium
July 17, 2022 - Time complexity: O(E log V). Prim’s algorithm can have different time complexities depending on how it is implemented. If we use a binary heap as the minimum-weighted-edge data structure, the time complexity of the algorithm can be shown to be O(E log V), which works well for sparse graphs.
Reddit
reddit.com › r/learnmath › [graph theory] what is time complexity?
r/learnmath on Reddit: [Graph Theory] what is time complexity?
May 2, 2021 -

I learned every lecture so far but some of the topics such as BFS have a note such as: Time complexity: uses at most c(n + e) steps (using the adjacency list)

But what tf does it even mean? Where can I start learning what it means?

Top answer
1 of 3
Time complexity is a way to express how long an algorithm will run. Typically it is not in absolute terms - it describes what happens as the size of the input grows. E.g., what happens if I put in twice as large an input? Algorithms with O(n) will take twice as long; those with O(n²) will need four times as much time for a doubled input. The complexity of BFS is O(n+e), where "n" is the number of vertices and "e" is the number of edges. In graphs, the number of edges matters for complexity: there are graphs like trees where e is close to n, but there are also complete graphs, where e is close to n². In BFS you visit each node only a limited number of times, so it is pretty efficient. Now if you had some algorithm that started a BFS from every node, the time complexity would be O(n*(n+e)).
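The answer above can be sketched in a few lines of Python. This is a minimal illustration, not code from the thread; the adjacency-list dict and function name are made up. Each vertex is dequeued once and each adjacency entry is scanned once, which is where O(n + e) comes from.

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first search over an adjacency-list dict.
    Visits each vertex once and scans each edge entry once: O(n + e)."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()          # each vertex dequeued at most once
        order.append(v)
        for w in adj[v]:             # over the whole run, these loops
            if w not in visited:     # touch every adjacency entry once
                visited.add(w)
                queue.append(w)
    return order

# Small example: doubling n and e roughly doubles the work.
adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs(adj, 0))  # [0, 1, 2, 3]
```

Running this `bfs` from every vertex in a loop would do n separate O(n + e) traversals, matching the O(n*(n+e)) figure at the end of the answer.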
2 of 3
Imagine we want to compare two algorithms' performance. Our first thought would be to run them both and see which one finishes first - think drag-racing algorithms. However, this is ineffective for two reasons: algorithms may have better or worse runtime depending on the size of the input, and this performance degradation doesn't always scale linearly; and different computers have different computational power, and therefore perform differently when running the same algorithm. The next thought might be to count the number of steps needed to process an input with a given algorithm, but this still suffers from issue #1, and in general we can't even know whether an algorithm will finish without running it (see: Halting Problem).

Time complexity is a notation computer scientists use to benchmark algorithms against one another in a way that is agnostic to a computer's individual performance and to the size of the input. Common notation is as follows:

- O() [Big-O notation] - an upper bound on the runtime; informally, the worst-case scenario.
- Ω() [Big-Omega notation] - a lower bound on the runtime; informally, the best-case scenario.
- Θ() [Big-Theta notation] - a tight bound: the runtime is bounded above and below by the same function (not, as often misstated, the "average case").

n often represents the input size. Let's think about searching a list for a value as our example. If we iterate over the list front to back looking for the value, the best-case scenario is that the value is at the front of the list, so our big-omega runtime would be Ω(1), otherwise known as constant runtime. However, what we often care more about when analyzing algorithms is how they perform in the worst case, since that tends to be closer to the general case (not always true - see: quicksort). Here the worst-case scenario is that our desired value is at the end of the list, meaning our linear search has a big-O runtime of O(n), where n is the length of the list, because we have to iterate over the entire list to get to the end. This is known as linear runtime.

Big-O and big-omega define upper and lower bounds for the behavior of our algorithm. If we were to plot runtime on a graph, big-O and big-omega would literally be two curves, big-O on top and big-omega on the bottom, between which the runtime always stays. Other common runtimes include quadratic O(n²), polynomial O(n^x) where x is any positive integer, exponential O(x^n), and factorial O(n!). Exponential and factorial algorithms are often considered intractable - they can't be executed in reasonable time on large datasets.

Here's a good introductory video to time complexity - especially good for visual learners: https://youtu.be/kgBjXUE_Nwc

If you're familiar with data structures and looking for an undergraduate-level introduction to algorithm complexity analysis, Algorithms by Jeff Erickson provides a free, easy-to-read introduction (if a little dense in parts). Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein (commonly referred to as CLRS, or just "the Bible of CS") is what I'd consider more suitable for a graduate-level algorithms course, but it's absolutely understandable for a dedicated undergrad CS student. My undergrad CS algorithms course supplemented Erickson's book with CLRS. CLRS is a must-read for anyone working in algorithms, academia, or CS research IMO. I'm personally a fan of Abdul Bari for anything algorithms-related; I haven't watched it personally, but here's his video about time complexity.
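The linear-search example in the answer above can be made concrete with a short Python sketch (my illustration, not the poster's code):

```python
def linear_search(items, target):
    """Front-to-back scan. Best case: target at index 0, one comparison
    (Omega(1)). Worst case: target at the end or absent, n comparisons (O(n))."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))   # best case: found immediately -> 0
print(linear_search(data, 5))   # worst case: last element -> 4
print(linear_search(data, 8))   # absent: full scan -> -1
```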
Discussions

algorithms - Time Complexity and graphs - Computer Science Stack Exchange
I'm learning graphs these days and need to clear a few doubts - Can I determine whether 5 points in two dimensions, whose X and Y coordinates are given, lie on the same straight line in O(1)? What is the More on cs.stackexchange.com
December 18, 2018
Time complexity of graphs - Stack Overflow
In more complex graphs, you will find more edges than vertices. On the other extreme, a fully connected graph has O(|V|^2) edges. For such graphs, you are comparing an O(|V|^4) algorithm with an O(|V|^5) algorithm. More on stackoverflow.com
dijkstra - On what does the time complexity for graph algorithms depends on? - Stack Overflow
I stumbled over this question in my textbook: "In general, on what does the time complexity of Prim's, Kruskal's and Dijkstra's algorithms depends on?" a. The number of vertices in the g... More on stackoverflow.com
algorithm - Breadth First Search time complexity analysis - Stack Overflow
The time complexity to go over each adjacent edge of a vertex is, say, O(N), where N is the number of adjacent edges. Summed over all V vertices, this becomes O(E), where E is the total number of edges in the graph. More on stackoverflow.com
People also ask

What are dynamic graph algorithms?
Dynamic graph algorithms are designed to efficiently update and maintain solutions as the graph changes, such as adding or removing vertices or edges. Examples include dynamic shortest path algorithms and dynamic connectivity algorithms.
wscubetech.com
wscubetech.com › resources › dsa › graph-algorithms
Graph Algorithms in Data Structure (Time Complexity & Techniques)
Why are graph algorithms important in computer science?
Graph algorithms are fundamental in computer science because they model and solve a wide range of problems, from network routing and web crawling to social network analysis and scheduling tasks.
When should I use Dijkstra’s Algorithm?
Dijkstra’s Algorithm is used to find the shortest path in weighted graphs with non-negative edge weights. It’s ideal for applications like routing and navigation, where you need the most efficient path from one point to another.
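As a rough illustration of the answer above, here is a minimal Python sketch of Dijkstra's algorithm with a binary heap (my own example, not from the cited page; the toy graph and names are made up):

```python
import heapq

def dijkstra(adj, source):
    """Shortest-path distances from source, assuming non-negative
    edge weights, using a binary heap as the priority queue."""
    dist = {source: 0}
    heap = [(0, source)]                 # (distance-so-far, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale heap entry, skip
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd             # found a shorter route to v
                heapq.heappush(heap, (nd, v))
    return dist

# Toy road network: adjacency lists of (neighbor, weight) pairs.
adj = {"A": [("B", 4), ("C", 1)], "C": [("B", 2)], "B": []}
print(dijkstra(adj, "A"))  # {'A': 0, 'B': 3, 'C': 1}
```

Note how the direct edge A-B (weight 4) loses to the detour A-C-B (weight 3), which is exactly the "most efficient path" behavior described above.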
Wikipedia
en.wikipedia.org › wiki › Time_complexity
Time complexity - Wikipedia
2 days ago - In complexity theory, the unsolved P versus NP problem asks if all problems in NP have polynomial-time algorithms. All the best-known algorithms for NP-complete problems like 3SAT etc. take exponential time. Indeed, it is conjectured for many natural NP-complete problems that they do not have sub-exponential time algorithms. Here "sub-exponential time" is taken to mean the second definition presented below. (On the other hand, many graph problems represented in the natural way by adjacency matrices are solvable in subexponential time simply because the size of the input is the square of the number of vertices.)
TutorialsPoint
tutorialspoint.com › graph_theory › graph_theory_time_complexity.htm
Graph Theory - Time Complexity
Time complexity in graph theory measures how fast or slow an algorithm works when solving problems with graphs. It shows how the algorithm's performance changes as the graph grows in size, which is usually measured by the number of nodes (V) and
W3Schools
w3schools.com › dsa › dsa_timecomplexity_theory.php
DSA Time Complexity
The relationship between time and ... in a graph like this: When talking about "operations" here, "one operation" might take one or several CPU cycles, and it really is just a word helping us to abstract, so that we can understand what time complexity is, and so that we can find the time complexity for different algorithms...
freeCodeCamp
freecodecamp.org › news › big-o-cheat-sheet-time-complexity-chart
Big O Cheat Sheet – Time Complexity Chart
November 7, 2024 - In plain terms, the algorithm will run input + 2 times, where input can be any number. This shows that it's expressed in terms of the input. In other words, it is a function of the input size. In Big O, there are six major types of complexities (time and space): ... Before we look at examples for each time complexity, let's understand the Big O time complexity chart. The Big O chart, also known as the Big O graph...
YourBasic
yourbasic.org › algorithms › graph
Introduction to graph algorithms: definitions and examples · YourBasic
The algorithm makes two calls to DFS for each edge {u, v} in E': one time when the algorithm visits the neighbors of u, and one time when it visits the neighbors of v. Hence, the time complexity of the algorithm is Θ(|V| + |E'|).
Thesciencebrigade
thesciencebrigade.com › jst › article › view › 452
Time Complexity Analysis of Graph Algorithms in Big Data: Evaluating the Performance of PageRank and Shortest Path Algorithms for Large-Scale Networks | Journal of Science & Technology
August 23, 2024 - This paper delves into the time complexity analysis of two prominent graph algorithms, PageRank and shortest path algorithms, with a focus on their performance in large-scale networks commonly encountered in big data systems.
WsCube Tech
wscubetech.com › resources › dsa › graph-algorithms
Graph Algorithms in Data Structure (Time Complexity & Techniques)
February 14, 2026 - Understand all graph algorithms in data structures, from basics to advanced techniques, enhancing your understanding of connectivity in this detailed tutorial.
Stack Overflow
stackoverflow.com › questions › 11890695 › on-what-does-the-time-complexity-for-graph-algorithms-depends-on
dijkstra - On what does the time complexity for graph algorithms depends on? - Stack Overflow
Additionally, different time complexities are possible through different implementations of the three algorithms, and analyzing each algorithm requires a consideration of both E and V. For example, Prim’s algorithm is O(V^2), but can be improved with the use of a min heap-based priority queue to achieve the complexity you found: O(ElogV). O(ElogV) may seem like the faster algorithm, but that’s not always the case. E can be as large as V^2, so in dense graphs with close to V^2 edges, O(ElogV) becomes O(V^2).
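The heap-based variant mentioned in the answer above can be sketched in Python ("lazy" Prim's with `heapq`; this is my illustration, not code from the answer, and the toy graph is made up). The heap holds at most O(E) entries, and each push/pop costs O(log V)-ish, giving the O(E log V) bound:

```python
import heapq

def prim_mst_weight(adj, start):
    """Total weight of a minimum spanning tree via lazy Prim's:
    O(E) heap operations at O(log V) each -> O(E log V)."""
    in_tree = set()
    total = 0
    heap = [(0, start)]                  # (edge weight into tree, vertex)
    while heap:
        w, u = heapq.heappop(heap)
        if u in in_tree:
            continue                     # stale entry for an already-added vertex
        in_tree.add(u)
        total += w
        for v, wt in adj[u]:
            if v not in in_tree:
                heapq.heappush(heap, (wt, v))
    return total

# Undirected toy graph as symmetric adjacency lists.
adj = {
    0: [(1, 4), (2, 1)],
    1: [(0, 4), (2, 2), (3, 5)],
    2: [(0, 1), (1, 2), (3, 8)],
    3: [(1, 5), (2, 8)],
}
print(prim_mst_weight(adj, 0))  # 8 (tree edges 0-2, 2-1, 1-3)
```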
GeeksforGeeks
geeksforgeeks.org › dsa › understanding-time-complexity-simple-examples
Time Complexity with Simple Examples - GeeksforGeeks
Time Complexity: O(n*m) The program iterates through all the elements in the 2D array using two nested loops. The outer loop iterates n times and the inner loop iterates m times for each iteration of the outer loop.
Published February 26, 2026
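The nested-loop case described in that snippet looks like this in Python (a minimal sketch of my own, not GeeksforGeeks code):

```python
def sum_2d(matrix):
    """Visits each of the n*m cells exactly once: O(n*m) time."""
    total = 0
    for row in matrix:        # outer loop: n iterations
        for value in row:     # inner loop: m iterations per outer iteration
            total += value
    return total

print(sum_2d([[1, 2, 3], [4, 5, 6]]))  # 2*3 = 6 cells visited -> 21
```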
Medium
ajakcyer97.medium.com › big-o-time-complexity-graph-simplified-798f3b67877a
Big O — Time Complexity Graph Simplified | by Ajak Cyer | Medium
March 16, 2021 - In the graph above, the X-axis represents the input value as it gets longer and the Y-axis represents the amount of time elapsed to perform the algorithm (function). Functions with an O notation of O(1) will complete the fastest because it doesn’t matter how large the input value is, the same amount of time will elapse to complete the function. This is constant time complexity.
arXiv
arxiv.org › abs › 2501.09144
[2501.09144] Rule-Based Graph Programs Matching the Time Complexity of Imperative Algorithms
January 27, 2026 - The first two programs run in linear ... input graphs, matching the time complexity of imperative implementations of the Bellman-Ford algorithm....
Top answer
1 of 9

I hope this is helpful to anybody having trouble understanding computational time complexity for Breadth First Search a.k.a BFS.

Queue<Vertex> graphTraversal = new LinkedList<>();
firstVertex.visited = true;
graphTraversal.add(firstVertex);

// This while loop runs V times, where V is the total number of vertices in the graph:
// marking vertices as visited before enqueueing guarantees each is enqueued at most once.
while (graphTraversal.isEmpty() == false) {
    currentVertex = graphTraversal.remove();

    // This inner loop runs Eaj times, where Eaj is the number of edges adjacent to the current vertex.
    for (adjacentVertex : currentVertex.adjacentVertices) {
        if (adjacentVertex.visited == false) {
            adjacentVertex.visited = true;
            graphTraversal.add(adjacentVertex);
        }
    }
}

Time complexity is as follows:

V * (O(1) + O(Eaj) + O(1))
= O(V) + O(sum of Eaj over all vertices) + O(V)
= 2V + E    (where E is the total number of edges in the graph)
= O(V + E)

I have tried to simplify the code and complexity computation but still if you have any questions let me know.

2 of 9

Consider the following graph to see why the time complexity is O(|V|+|E|) and not O(|V|*|E|).

Adjacency List

V     E
v0:{v1,v2} 
v1:{v3}
v2:{v3}
v3:{}

How BFS Works, Step by Step

Step 1:

Adjacency lists:

V     E
v0: {v1,v2} mark, enqueue v0
v1: {v3}
v2: {v3}
v3: {}

Step 2:

Adjacency lists:

V     E
v0: {v1,v2} dequeue v0;mark, enqueue v1,v2
v1: {v3}
v2: {v3}
v3: {}

Step 3:

Adjacency lists:

V     E
v0: {v1,v2}
v1: {v3} dequeue v1; mark,enqueue v3
v2: {v3}
v3: {}

Step 4:

Adjacency lists:

V     E
v0: {v1,v2}
v1: {v3}
v2: {v3} dequeue v2, check its adjacency list (v3 already marked)
v3: {}

Step 5:

Adjacency lists:

V     E
v0: {v1,v2}
v1: {v3}
v2: {v3}
v3: {} dequeue v3; check its adjacency list

Step 6:

Adjacency lists:

V     E
v0: {v1,v2} |E0|=2
v1: {v3}    |E1|=1
v2: {v3}    |E2|=1
v3: {}      |E3|=0

Total number of steps:

|V| + |E0| + |E1| + |E2| +|E3| == |V|+|E|
 4  +  2   +  1   +   1  + 0   ==  4 + 4
                           8   ==  8

Assume an adjacency-list representation; V is the number of vertices, E the number of edges.

Each vertex is enqueued and dequeued at most once.

Scanning all the adjacency lists takes O(|E|) time, since the sum of the lengths of the adjacency lists is |E|.

Hence BFS has a time complexity of O(|V| + |E|).
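The step count worked out above can be checked by instrumenting a BFS in Python (my sketch, not the answerer's code; the graph is the same v0..v3 example):

```python
from collections import deque

def bfs_step_count(adj, start):
    """Counts one 'step' per dequeue and one per adjacency-list entry
    scanned; on this example the total is |V| + |E|."""
    visited = {start}
    queue = deque([start])
    steps = 0
    while queue:
        v = queue.popleft()
        steps += 1                # one step per vertex dequeued: |V| total
        for w in adj[v]:
            steps += 1            # one step per edge entry scanned: |E| total
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return steps

adj = {"v0": ["v1", "v2"], "v1": ["v3"], "v2": ["v3"], "v3": []}
print(bfs_step_count(adj, "v0"))  # 4 vertices + 4 edges = 8
```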

Baeldung
baeldung.com › home › graph theory › graphs › time and space complexity of adjacency matrix and list
Time and Space Complexity of Adjacency Matrix and List | Baeldung on Computer Science
March 18, 2024 - In a complete graph with n vertices, the adjacency list of every vertex contains n−1 elements, as every vertex is connected with every other vertex in such a graph. Therefore, the time complexity of checking the presence of an edge in the adjacency list is O(n). Let’s assume that an algorithm often requires checking the presence of an arbitrary edge in a graph.
Desmos
desmos.com › calculator › cvytuiwjja
Algorithmic Time Complexity | Desmos
Explore math with our beautiful, free online graphing calculator. Graph functions, plot points, visualize algebraic equations, add sliders, animate graphs, and more.