O(N)
Example:
var words = ['spray', 'limit', 'elite', 'exuberant', 'destruction', 'present'];
const result = words.filter(word => isBig(word));

function isBig(word) {
  console.log("called");
  return word.length > 6;
}
Output:
"called" "called" "called" "called" "called" "called"
Array ["exuberant", "destruction", "present"]
(Answer from Saurabh Vaidya on Stack Overflow)
This is from the MDN docs: "The callback function is called sequentially and at most once for each element in the array, and the return value of the callback function is used to determine the return value of the method."
Since the callback function is called at most once for each element in the array, the length of the array dictates the number of operations.
Therefore, the time complexity of the Array.prototype.filter method is O(n).
Keep in mind that while the time complexity of the filter method itself is O(n), it is not immune to the cost of the operations you add to the callback function. For example, if the callback you provide to the filter method requires iterating through another array, then your time complexity could become quadratic, because now you have a nested loop. It really depends on what operations you run in your filter's callback. If your filter simply contains a boolean check, which is a single operation, then the total would be O(n * 2) or O(n + n); either way that simplifies to O(n).
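A quick sketch of that point (the words and banned arrays here are made up for illustration): the filter loop itself stays O(n), but the cost of each callback call multiplies it.

```javascript
// Hypothetical data, just to contrast callback costs.
const words = ['spray', 'limit', 'elite', 'exuberant'];
const banned = ['limit', 'elite'];

// O(n): the callback is a constant-time boolean check.
const longWords = words.filter(word => word.length > 6);

// O(n * m): each callback call scans `banned` with includes(),
// so the filter hides a nested loop.
const allowed = words.filter(word => !banned.includes(word));

console.log(longWords); // ["exuberant"]
console.log(allowed);   // ["spray", "exuberant"]
```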
JavaScript runtime complexity of Array functions - Stack Overflow
How do you find time complexity when using Javascript array methods
How to improve the performance of filtering an array that has 30k elements?
javascript - What is the runtime complexity of this function? - Stack Overflow
The items.filter(() => { ... }) is a loop => O(n).
You have a for loop inside of it looping over the object keys => O(m * n).
The Object.keys() call is O(m) in V8, and you have it twice in the for loop (once in the condition, where it is re-evaluated on every iteration, and once in the loop body), so it becomes O(m^2 * n) (where m is the number of keys).
Also, you can use
for (let key in item) {
// and do whatever you want with the key
}
instead of using Object.keys.
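A sketch of that difference (the item object here is hypothetical): hoisting Object.keys() out of the loop, or using for...in, removes the repeated O(m) calls.

```javascript
// Hypothetical item, just for illustration.
const item = { name: 'foo', size: 3, color: 'red' };

// O(m^2): Object.keys() rebuilds the key array on every iteration,
// once in the condition and once in the body.
for (let i = 0; i < Object.keys(item).length; i++) {
  console.log(Object.keys(item)[i]);
}

// O(m): compute the key list once...
const keys = Object.keys(item);
for (let i = 0; i < keys.length; i++) {
  console.log(keys[i]);
}

// ...or use for...in, which never materializes the key array.
for (const key in item) {
  console.log(key);
}
```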
Let me change your code a bit:
items.filter(item => Object.keys(item).forEach(key => console.log("item is", key)));
The lambda is executed for every item in items. The lambda iterates over every key in an item and prints it. The time complexity is therefore O(n * m) for n = number of items and m = number of keys per item. If the number of keys per item is fixed and relatively small, you can assume O(n). Big O notation is just a rough estimate of the runtime; constant factors aren't that interesting.
The ECMA specification does not specify a bounding complexity, however, you can derive one from the specification's algorithms.
push is O(1); however, in practice it will encounter an O(N) copy cost at engine-defined boundaries, as the slot array needs to be reallocated. These boundaries are typically logarithmic.
pop is O(1) with a similar caveat to push but the O(N) copy is rarely encountered as it is often folded into garbage collection (e.g. a copying collector could only copy the used part of an array).
shift is at worst O(N); however, it can, in special cases, be implemented as O(1) at the cost of slowing down indexing, so your mileage may vary.
slice is O(N) where N is end - start. Not a tremendous amount of optimization opportunity here without significantly slowing down writes to both arrays.
splice is, worst case, O(N). There are array storage techniques that divide N by a constant but they significantly slow down indexing. If an engine uses such techniques you might notice unusually slow operations as it switches between storage techniques triggered by access pattern changes.
One you didn't mention is sort. It is, in the average case, O(N log N). However, depending on the algorithm chosen by the engine, you could get O(N^2) in some cases. For example, if the engine uses QuickSort (even with a late bail-out to InsertionSort), it has well-known N^2 cases. This could be a source of DoS for your application. If this is a concern, either limit the size of the arrays you sort (maybe by merging sub-arrays) or bail out to HeapSort.
In very simple words:
push -> O(1)
pop -> O(1)
shift -> O(N)
slice -> O(N)
splice -> O(N)
Here is a complete explanation about time complexity of Arrays in JavaScript.
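As a rough illustration of that summary, here is a micro-benchmark sketch (Node-specific timing via process.hrtime.bigint; engines sometimes optimize shift, so the gap may be smaller than the asymptotics suggest):

```javascript
// Rough micro-benchmark sketch (Node-only; numbers vary by engine and N).
function time(label, fn) {
  const start = process.hrtime.bigint();
  fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(2)} ms`);
}

const N = 20000;
const a = Array.from({ length: N }, (_, i) => i);
const b = Array.from({ length: N }, (_, i) => i);

// pop removes from the end: O(1) per call, O(N) to drain the array.
time('pop all', () => { while (a.length) a.pop(); });

// shift removes from the front: O(N) per call in the naive model,
// so draining this way can approach O(N^2) total.
time('shift all', () => { while (b.length) b.shift(); });
```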
I would usually follow the rule of "look at the loops", but I don't know how all those functions are implemented under the hood.
I have a function that performs intersection as follows
Array.from(new Set(arr.filter((i) => arr2.includes(i))))

or

let p = new Set(arr)
return [...p].filter((item) => arr2.includes(item))

I feel it's linear time since, by common sense, if the arrays increase in length the time would also increase. But I'm not sure.
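For what it's worth, that snippet is not linear: arr2.includes(i) is itself O(m), so the whole thing is O(n * m). A sketch of a linear-time variant (same inputs assumed) that uses a Set for the membership test, which is O(1) on average:

```javascript
// O(n + m) intersection: build the Set once, then each .has() is ~O(1),
// instead of re-scanning arr2 with includes() for every element of arr.
function intersection(arr, arr2) {
  const lookup = new Set(arr2);
  return [...new Set(arr)].filter(item => lookup.has(item));
}

console.log(intersection([1, 2, 2, 3], [2, 3, 4])); // [2, 3]
```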
I'm using Vue.js 2. I basically have an input box; when I type in it, it gives me a dropdown with the list of elements I searched for. Say I search for "foo": it will filter the array for all items that have the name "foo" in them. But if I search for "foo bar", it should give me the list of items that have the name "foo" or "bar" or "foo bar". This is for business reasons.
The array of items has 30k records, this is the code I am using to do the filtering:
let keyWordsTyped = ["foo", "bar"]; // the searched keywords are placed in an array
items = items
  .filter((item) => keyWordsTyped.some((keyword) =>
    item.name.toLowerCase().includes(keyword)))
  .map((item) => highlightItem(item));

The thing is, if I search for a single term like "foo" it works fine; the speed is fine. HOWEVER, if I search for "foo bar" it gets way too slow and laggy. I guess this is because when I search for a single word the returned array is small, but if I search for two words it gets much bigger.
Is there any way to make this faster??
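One possible sketch (assuming each item has a name string; highlightItem is not shown in the question, so it is stubbed here as identity, and the sample items are made up): make sure the callbacks return their results, and lowercase each name only once per item instead of once per keyword comparison.

```javascript
// Hypothetical sample data; `highlightItem` is assumed from the question.
const items = [{ name: 'Foo Widget' }, { name: 'Bar Gadget' }, { name: 'Baz' }];
const highlightItem = item => item;

const keyWordsTyped = ['foo', 'bar'];

const matches = items
  // Precompute the lowercased name once per item.
  .map(item => ({ item, lower: item.name.toLowerCase() }))
  // Keep items whose name contains at least one typed keyword.
  .filter(({ lower }) => keyWordsTyped.some(keyword => lower.includes(keyword)))
  .map(({ item }) => highlightItem(item));

console.log(matches.map(m => m.name)); // ["Foo Widget", "Bar Gadget"]
```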
Yes, time complexity is O(n^2) - for example, if arr has 10 items, the algorithm needs to make ~100 comparisons before finishing.
Space complexity is O(n). For example, consider the last iteration of the outer .filter: the result that's almost finished being constructed takes up O(n) space at that time (worst case, equal to the size of the input arr). The inner array inside the callback being filtered (which will then have its length checked, and returned) will also be, worst case, the size of the input n. So, the most space in use at any point in time is O(2n), which is equivalent to O(n).
I think you are right about the time complexity, it is O(n^2) due to the two nested loops.
IMO space complexity is O(n), as you need only n units of space to keep the array and no additional memory is allocated.