You are adding them as ints, and Java's int type has a maximum value of 2147483647.
If you exceed this maximum, the value wraps around to the minimum:
Integer.MAX_VALUE + 1 = -2147483648.
You are performing an integer addition with your current code. My assumption is that you are aware of the numeric overflow that could occur, and that's the reason the result has type long. Things will fall into place once you map the values of the stream to Long before summing them:
long result = numbers.stream()
.map(Long::valueOf)
.reduce(0L, Long::sum);
Or simply put:
long result = numbers.stream()
.mapToLong(val -> val)
.sum();
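To see the difference between the two approaches, here is a minimal self-contained sketch, assuming a hypothetical list whose int sum overflows:

```java
import java.util.List;

public class SumDemo {
    public static void main(String[] args) {
        // Hypothetical data: the int sum of these two values overflows
        List<Integer> numbers = List.of(Integer.MAX_VALUE, 1);

        // Plain int addition wraps around on overflow
        int wrong = numbers.stream().reduce(0, Integer::sum);
        System.out.println(wrong);  // -2147483648

        // Widening each element to long before summing avoids the overflow
        long right = numbers.stream().mapToLong(val -> val).sum();
        System.out.println(right);  // 2147483648
    }
}
```

Note that assigning the int result to a long variable does not help: the overflow has already happened inside the int addition before the widening conversion takes place.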
No, a stream pipeline doesn't allow you to skip over any part of the pipeline. You're forced either to use conditional logic inside the steps, always including limit() in the pipeline, or to build the stream in parts, which would be a bit more legible (IMHO) than the if/else in the question:
Stream<Item> s = items.stream()
.map(this::myMapper)
.filter(Objects::nonNull);
if (maxNumber > 0) {
    s = s.limit(maxNumber);
}
List<Item> l = s.collect(Collectors.toList());
In a simple case like this it doesn't make much difference, but you often see regular code where collections are passed through methods, converted to streams, and then back to collections. In such cases it might be a better idea to work with streams in parts until you really need to collect().
I suppose that
.limit(maxNumber == 0 ? Long.MAX_VALUE : maxNumber)
will do the trick, as it is highly improbable that you will ever have to handle a stream with more than 2^63-1 elements...
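Inlined into a pipeline, the ternary keeps everything a single expression. A minimal sketch, with a hypothetical list of strings standing in for the question's items and the convention that maxNumber == 0 means "no limit":

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class LimitDemo {
    public static void main(String[] args) {
        List<String> items = List.of("a", "b", "c", "d");  // hypothetical data
        long maxNumber = 0;  // 0 means "no limit" in this convention

        List<String> result = items.stream()
                .filter(Objects::nonNull)
                // Long.MAX_VALUE effectively disables the limit
                .limit(maxNumber == 0 ? Long.MAX_VALUE : maxNumber)
                .collect(Collectors.toList());

        System.out.println(result);  // [a, b, c, d]
    }
}
```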
At least be careful with parallel streams here. A note in the API docs says:

API Note: While limit() is generally a cheap operation on sequential stream pipelines, it can be quite expensive on ordered parallel pipelines, especially for large values of maxSize...
Since only 4 elements pass the filter, limit(10) never reaches 10 elements, so the stream pipeline keeps generating new elements and feeding them to the filter, trying to find 10 that pass. But since only the first 4 elements pass the filter, the processing never ends (at least until i overflows).
The Stream pipeline is not smart enough to know that no more elements can pass the filter, so it keeps processing new elements.
Flipping the limit and the filter clauses produces different behavior.
If you put the limit first, the stream will first generate 10 integers [1..10], and then filter them leaving only those smaller than 5.
In the original ordering, with the filter applied first, integers are generated and filtered until 10 elements have passed the filter. This isn't an infinite operation, as i in the supplier will eventually overflow, but it will take a while, especially on a slow computer, to reach Integer.MAX_VALUE.
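The two orderings can be sketched side by side. This is a hypothetical reconstruction using Stream.iterate as the generator; the filter-first pipeline is given an extra safety bound on the source (not part of the original pipeline) so the example terminates instead of running until i overflows:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class OrderDemo {
    public static void main(String[] args) {
        // limit first: take the first 10 generated integers, then filter them
        List<Integer> limitFirst = Stream.iterate(1, i -> i + 1)
                .limit(10)
                .filter(i -> i < 5)
                .collect(Collectors.toList());
        System.out.println(limitFirst);  // [1, 2, 3, 4]

        // filter first: keep generating until 10 elements pass the filter;
        // with an unbounded source this would run until i overflows, so we
        // bound the source here purely to keep the example terminating
        List<Integer> filterFirst = Stream.iterate(1, i -> i + 1)
                .limit(1_000)  // safety bound, not in the original code
                .filter(i -> i < 5)
                .limit(10)
                .collect(Collectors.toList());
        System.out.println(filterFirst);  // [1, 2, 3, 4]
    }
}
```

Both pipelines happen to produce the same list here, but the filter-first version had to consume all 1000 source elements to do so, while the limit-first version consumed only 10.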