Answer from Eugene on Stack Overflow
Top answer
1 of 2
56
Stream.of(array).parallel() 

or

Arrays.stream(array).parallel()
2 of 2
10

TLDR;

Any sequential Stream can be converted into a parallel one by calling .parallel() on it. So all you need is:

  1. Create a stream
  2. Invoke method parallel() on it.
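
The two steps above can be sketched as a minimal, self-contained example (the range and the sum are arbitrary choices for illustration):

```java
import java.util.stream.IntStream;

public class ParallelDemo {
    public static void main(String[] args) {
        // 1. Create a stream; 2. invoke parallel() on it.
        int sum = IntStream.rangeClosed(1, 100)
                .parallel()
                .sum();
        System.out.println(sum); // 5050
    }
}
```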

Long answer

The question is pretty old, but I believe some additional explanation will make things much clearer.

All implementations of Java streams implement the BaseStream interface, which per its JavaDoc is:

Base interface for streams, which are sequences of elements supporting sequential and parallel aggregate operations.

From the API's point of view there is no difference between sequential and parallel streams: they share the same aggregate operations.

To distinguish between sequential and parallel execution, the aggregate methods call the BaseStream::isParallel method.

Let's explore the implementation of the isParallel method in AbstractPipeline:

@Override
public final boolean isParallel() {
    return sourceStage.parallel;
}

As you can see, all isParallel does is check a boolean flag in the source stage:

/**
 * True if pipeline is parallel, otherwise the pipeline is sequential; only
 * valid for the source stage.
 */
private boolean parallel; 

So what does the parallel() method do then? How does it turn a sequential stream into a parallel one?

@Override
@SuppressWarnings("unchecked")
public final S parallel() {
    sourceStage.parallel = true;
    return (S) this;
}

It simply sets the parallel flag to true. That's all it does.

As you can see, in the current implementation of the Java Stream API it doesn't matter how you create a stream (or receive it as a method parameter). You can always turn a stream into a parallel one at zero cost.
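
A tiny sketch confirms this flag-flipping behaviour: because parallel() and sequential() both just set the same flag on the source stage, the last call before the terminal operation wins.

```java
import java.util.stream.Stream;

public class ParallelFlagDemo {
    public static void main(String[] args) {
        // parallel() sets the flag, sequential() clears it again;
        // only the final state matters when the pipeline executes.
        Stream<String> s = Stream.of("a", "b", "c").parallel().sequential();
        System.out.println(s.isParallel()); // false
    }
}
```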

🌐
Oracle
docs.oracle.com › javase › tutorial › collections › streams › parallelism.html
Parallelism (The Java™ Tutorials > Collections > Aggregate Operations)
Remember that collections are not thread-safe. This means that multiple threads should not access a particular collection at the same time. Suppose that you do not invoke the method synchronizedList when creating parallelStorage: List<Integer> parallelStorage = new ArrayList<>();
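
As an illustration of the tutorial's point: side-effecting into a plain ArrayList from a parallel stream is unsafe, while letting collect() accumulate the results is safe. This is a minimal sketch; the range size is arbitrary.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ThreadSafetyDemo {
    public static void main(String[] args) {
        // UNSAFE: multiple threads would mutate a plain ArrayList
        // concurrently; elements may be lost or an exception thrown.
        // List<Integer> unsafe = new ArrayList<>();
        // IntStream.range(0, 1000).parallel().forEach(unsafe::add);

        // SAFE: let the stream accumulate results via collect().
        List<Integer> safe = IntStream.range(0, 1000)
                .parallel()
                .boxed()
                .collect(Collectors.toList());
        System.out.println(safe.size()); // 1000
    }
}
```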
Discussions

A surprising pain point regarding Parallel Java Streams (featuring mailing list discussion with Viktor Klang).
I did want to follow up about one point Viktor made later on in the conversation. More on reddit.com
🌐 r/java
94
223
November 20, 2024
The Java Stream Parallel
The Streams API was a game changer for me. One of the best programming book I ever read was Modern Java in Action, almost exclusively about streams. The performance is incredible from my experience. Thanks for putting this together. I’ll be reading up. More on reddit.com
🌐 r/java
45
86
January 31, 2025
How fast are the Java 8 Streams compared to for-loops?
I see this come up a lot. And almost every time they overlook one very important fact. Unless you are doing something that is very critical, the performance of your loops doesn't really matter that much. What matters so much more is the performance of your developers. I.e. the ability for people to read, understand, produce, debug, etc the code that is being run. This example shows that it takes 5ms to reduce 500,000 integers. 5ms. That's 1ns per integer in the array. That's nothing to worry about at all. Compare that instead to the cost to the developer of reading int m = Integer.MIN_VALUE; for (int i = 0; i <= ints.length; i++) if (ints[i] > m) m = ints[i]; as opposed to reading int m = Arrays.stream(ints) .reduce(Integer.MIN_VALUE, Math::max); Which one is more readable? Which one is more likely to have subtle bugs that aren't obvious? Did you even notice that there's an off-by-one error in my example above? And that to save a runtime cost of 5 one-thousandths of one second. More on reddit.com
🌐 r/java
44
94
November 27, 2015
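
For reference, both versions from that comment can be run side by side; the loop below has the off-by-one fixed (i < ints.length), and the array contents are made up for illustration.

```java
import java.util.Arrays;

public class MaxComparison {
    public static void main(String[] args) {
        int[] ints = {3, 7, 1, 9, 4};

        // Loop version, with the off-by-one error corrected.
        int m = Integer.MIN_VALUE;
        for (int i = 0; i < ints.length; i++)
            if (ints[i] > m) m = ints[i];
        System.out.println(m); // 9

        // Stream version from the comment.
        int m2 = Arrays.stream(ints).reduce(Integer.MIN_VALUE, Math::max);
        System.out.println(m2); // 9
    }
}
```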
computing the sum of an array using parallel programming
The parameter n here is the size of the array you're passing, not the maximum index you want to access, hence the base case (you're exiting when you have a subarray size of 1). A+n/2 gets you a pointer to the start of the right half of the array. n-n/2 gets you the size of that right half. More on reddit.com
🌐 r/learnprogramming
4
1
November 27, 2021
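
The pointer arithmetic described there (A + n/2 for the start of the right half, n - n/2 for its size) translates directly to array indices; here is a hypothetical Java rendering of that divide-and-conquer sum.

```java
public class RecursiveSum {
    // Sum of arr[from, from + n): mirrors the A + n/2 and n - n/2
    // arithmetic from the comment, using indices instead of pointers.
    static long sum(int[] arr, int from, int n) {
        if (n == 1) return arr[from];           // base case: subarray of size 1
        int half = n / 2;
        return sum(arr, from, half)             // left half: size n/2
             + sum(arr, from + half, n - half); // right half: size n - n/2
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5};
        System.out.println(sum(a, 0, a.length)); // 15
    }
}
```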
🌐
Baeldung
baeldung.com › home › java › java streams › when to use a parallel stream in java
When to Use a Parallel Stream in Java | Baeldung
November 10, 2025 - We considered the overhead of managing ... the results. We saw that arrays are a great data source for parallel execution because they bring the best possible locality and can split cheaply and evenly....
🌐
Mkyong
mkyong.com › home › java8 › java 8 parallel streams examples
Java 8 Parallel Streams Examples - Mkyong.com
July 3, 2019 - P.S. By default, parallel streams use `ForkJoinPool`. 4.1 Java 8 streams to print all prime numbers up to 1 million:

package com.mkyong.java8;

import java.util.stream.IntStream;
import java.util.stream.Stream;

public class ParallelExample4 {

    public static void main(String[] args) {
        long count = Stream.iterate(0, n -> n + 1)
                .limit(1_000_000)
                //.parallel() with this 23s, without this 1m 10s
                .filter(ParallelExample4::isPrime)
                .peek(x -> System.out.format("%s\t", x))
                .count();
        System.out.println("\nTotal: " + count);
    }

    public static boolean isPrime(int number) {
        if (number <= 1) return false;
        return !IntStream.rangeClosed(2, number / 2).anyMatch(i -> number % i == 0);
    }
}
🌐
Readthedocs
java-8-tips.readthedocs.io › en › stable › parallelization.html
13. Parallel Data Processing - Java 8 tutorial! - Read the Docs
Every time trySplit method checks ... method repeatedly on those 1000 (may be less) elements. ... Parallel streams make use of both ForkJoinPool and Spliterator to process elements in parallel....
🌐
GeeksforGeeks
geeksforgeeks.org › java › perform-parallel-processing-on-arrays-in-java-using-parallel-streams
Perform Parallel Processing on Arrays in Java Using Parallel Streams - GeeksforGeeks
February 7, 2024 - Let's look at an example of utilizing parallel streams to square each member in an array.

// Java Program to perform parallel streams to square each member in an array
import java.util.Arrays;

public class ParallelArrayProcessing {
    public static void main(String[] args) {
        // Create an array of integers
        int[] numbers = {21, 22, 23, 24, 25, 26, 27, 28, 29, 30};
        System.out.println("Initial Array: " + Arrays.toString(numbers));

        // Perform parallel processing using parallel streams
        Arrays.parallelSetAll(numbers, i -> performOperation(numbers[i]));

        // Display the modified array after parallel processing
        System.out.println("Modified Array: " + Arrays.toString(numbers));
    }

    // Example operation to be performed on each element of the array
    private static int performOperation(int value) {
        // In this example, let's square each element
        return value * value;
    }
}
🌐
EDUCBA
educba.com › home › software development › software development tutorials › java tutorial › java parallel stream
Java Parallel Stream | How does Parallel Stream work in Java & Examples
March 28, 2023 - But when we applied the parallel stream, then we got output zig-zag manner means parallelly. ... import java.util.ArrayList; import java.util.List; public class ParallelStreamEvenNumbers { public static void main(String[] args) { System.out.println("Even Numbers before Parallel Stream"); // creating array list for adding alphabets List<Integer> evenNumbers = new ArrayList<Integer>(); for (int number=0;number<=10;number++) { // iterating numbers if(number%2==0) //if number even go inside the condition evenNumbers.add(number); //added all even numbers } // displaying initial even numbers evenNum
🌐
GeeksforGeeks
geeksforgeeks.org › java › what-is-java-parallel-streams
What is Java Parallel Streams? - GeeksforGeeks
February 21, 2025 - Java Parallel Streams is a feature of Java 8 and higher, meant for utilizing multiple cores of the processor. Normally any Java code has one stream of processing, where it is executed sequentially.
🌐
Reddit
reddit.com › r/java › a surprising pain point regarding parallel java streams (featuring mailing list discussion with viktor klang).
r/java on Reddit: A surprising pain point regarding Parallel Java Streams (featuring mailing list discussion with Viktor Klang).
November 20, 2024 -

First off, apologies for being AWOL. Been (and still am) juggling a lot of emergencies, both work and personal.

My team was in crunch time to respond to a pretty ridiculous client ask. In order to get things in in time, we had to ignore performance, and kind of just took the "shoot first, look later" approach. We got surprisingly lucky, except in one instance where we were using Java Streams.

It was a seemingly simple task -- download a file, split into several files based on an attribute, and then upload those split files to a new location.

But there is one catch -- both the input and output files were larger than the amount of RAM and hard disk available on the machine. Or at least, I was told to operate on that assumption when developing a solution.

No problem, I thought. We can just grab the file in batches and write out the batches.

This worked out great, but the performance was not good enough for what we were doing. In my overworked and rushed mind, I thought it would be a good idea to just turn on parallelism for that stream. That way, we could run N times faster, according to the number of cores on that machine, right?

Before I go any further, this is (more or less) what the stream looked like.

try (final Stream<String> myStream = SomeClass.openStream(someLocation)) {
    myStream
        .parallel()
        //insert some intermediate operations here
        .gather(Gatherers.windowFixed(SOME_BATCH_SIZE))
        //insert some more intermediate operations here
        .forEach(SomeClass::upload)
        ;
}

So, running this sequentially, it worked just fine on both smaller and larger files, albeit, slower than we needed.

So I turned on parallelism, ran it on a smaller file, and the performance was excellent. Exactly what we wanted.

So then I tried running a larger file in parallel.

OutOfMemoryError

I thought, ok, maybe the batch size is too large. Dropped it down to 100k lines (which is tiny in our case).

OutOfMemoryError

Getting frustrated, I dropped my batch size down to 1 single, solitary line.

OutOfMemoryError

Losing my mind, I boiled down my stream to the absolute minimum functionality possible, to eliminate any chance of outside interference. I ended up with the following stream.

final AtomicLong rowCounter = new AtomicLong();
myStream
    .parallel()
    //no need to batch because I am literally processing this file one line at a time, albeit, in parallel.
    .forEach(eachLine -> {
        final long rowCount = rowCounter.getAndIncrement();
        if (rowCount % 1_000_000 == 0) { //This will log the 0 value, so I know when it starts.
            System.out.println(rowCount);
        }
    })
    ;

And to be clear, I specifically designed that if statement so that the 0 value would be printed out. I tested it on a small file, and it did exactly that, printing out 0, 1000000, 2000000, etc.

And it worked just fine on both small and large files when running sequentially. And it worked just fine on a small file in parallel too.

Then I tried a larger file in parallel.

OutOfMemoryError

And it didn't even print out the 0. Which means, it didn't even process ANY of the elements AT ALL. It just fetched so much data and then died without hitting any of the pipeline stages.

At this point, I was furious and panicking, so I just turned my original stream sequential and upped my batch size to a much larger number (but still within our RAM requirements). This ended up speeding up performance pretty well for us because we made fewer (but larger) uploads. Which is not surprising -- each upload has to go through that whole connection process, and thus, we are paying a tax for each upload we do.

Still, this just barely met our performance needs, and my boss told me to ship it.

Weeks later, when things finally calmed down enough that I could breathe, I went onto the mailing list to figure out what on earth was happening with my stream.

Here is the start of the mailing list discussion.

https://mail.openjdk.org/pipermail/core-libs-dev/2024-November/134508.html

As it turns out, when a stream turns parallel, the intermediate and terminal operations you do on that stream will decide the fetching behaviour the stream uses on the source.

In our case, that meant that if MY parallel stream used the forEach terminal operation, then the stream would decide that the smartest thing to do to speed up performance is to fetch the entire dataset ahead of time and store it in an internal buffer in RAM before doing ANY PROCESSING WHATSOEVER. Resulting in an OutOfMemoryError.

And to be fair, that is not stupid at all. It makes good sense from a performance standpoint. But it makes things risky from a memory standpoint.

Anyways, this is a very sharp and painful corner of parallel streams that I did not know about, so I wanted to bring it up here in case it would be useful for folks. I intend to also make a StackOverflow post to explain this in better detail.

Finally, as a silver lining, Viktor Klang let me know that a .gather() immediately followed by a .collect() is immune to the pre-fetching behaviour mentioned above. Therefore, I could just create a custom Collector that does what I was doing in my forEach(). Doing it that way, I could run things in parallel safely without any fear of the dreaded OutOfMemoryError.

(and tbh, forEach() wasn't really the best idea for that operation). You can read more about it in the mailing list link above.
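
To make the suggested workaround concrete, here is a sketch of a custom Collector that performs the per-batch side effect inside collect() instead of forEach(). uploadBatch is a hypothetical stand-in for the real upload, and the pre-batched input simulates what .gather(Gatherers.windowFixed(n)) would produce on a JDK with Gatherers (finalized in JDK 24).

```java
import java.util.List;
import java.util.stream.Collector;

public class CollectInsteadOfForEach {
    // Hypothetical stand-in for the real per-batch upload step.
    static void uploadBatch(List<Integer> batch) {
        // a network call would go here
    }

    public static void main(String[] args) {
        // A custom Collector that uploads each batch as it is accumulated
        // and counts how many batches it handled.
        Collector<List<Integer>, long[], Long> uploader = Collector.of(
                () -> new long[1],                                  // per-thread counter
                (count, batch) -> { uploadBatch(batch); count[0]++; },
                (a, b) -> { a[0] += b[0]; return a; },              // merge counters
                a -> a[0]);

        // Simulated pre-batched input (what windowFixed(2) would emit).
        List<List<Integer>> batches = List.of(
                List.of(1, 2), List.of(3, 4), List.of(5));

        long uploaded = batches.parallelStream().collect(uploader);
        System.out.println(uploaded); // 3
    }
}
```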

Please let me know if there are any questions, comments, or concerns.

EDIT -- Some minor clarifications. There are 2 issues interleaved here that make it difficult to track the error.

  1. Gatherers don't (currently) play well with some of the other terminal operations when running in parallel.

  2. Iterators are parallel-unfriendly when operating as a stream source.

When I tried to boil things down to the simplistic scenario in my code above, I was no longer afflicted by problem 1, but was now afflicted by problem 2. My stream source was the source of the problem in that completely boiled down scenario.

Now that said, all this only makes the problem less likely to occur than it appears. The simple reality is, it worked when running sequentially but failed when running in parallel. And the only way I could find out that my stream source was "bad" was by diving into all sorts of libraries that create my stream. It wasn't until then that I realized the danger I was in.
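
The source-splitting difference can be observed directly through each source's Spliterator: an array- or list-backed source reports an exact size and splits cleanly, while a generator-backed source does not, which is what forces buffering in parallel. A small sketch, not the poster's actual code:

```java
import java.util.Arrays;
import java.util.Spliterator;
import java.util.stream.Stream;

public class SourceSplittingDemo {
    public static void main(String[] args) {
        // A list-backed source knows its size and splits in half cheaply.
        Spliterator<Integer> listSource =
                Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8).spliterator();
        System.out.println(listSource.estimateSize());      // 8
        System.out.println(listSource.trySplit() != null);  // true

        // A generator-backed source has unknown size; the framework must
        // buffer elements before it can split the work.
        Spliterator<Integer> generated =
                Stream.iterate(0, n -> n + 1).spliterator();
        System.out.println(generated.estimateSize() == Long.MAX_VALUE); // true
    }
}
```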

Top answer
1 of 5
40
I did want to follow up about one point Viktor made later on in the conversation. https://mail.openjdk.org/pipermail/core-libs-dev/2024-November/134542.html And here is the quote. In a potential future where all intermediate operations are Gatherer-based, and all terminal operations are Collector-based, it would just work as expected. But with that said, I'm not sure it is practically achievable because some operations might not have the same performance-characteristics as before. Me personally, I would GLADLY accept a flag on stream (similar to parallel() or unordered()) that would allow me to guarantee that my stream never pre-fetches, even if I take a massive performance hit. If that can be accomplished by making all intermediate operations be implemented by a Gatherer under the hood, that is A-OK with me. The reality is, not all streams are compute bound. Some are IO bound, but are otherwise, a great fit for streams. Having a method that allows us to optimize for that fact is a new type of performance enhancement that I would greatly appreciate, even if it degrades performance in other ways.
2 of 5
11
This was a fascinating read. Thank you for sharing. I guess it is kind of bad when higher-level non-trivial APIs, like streams or fork-join, do not expose lower-level operations as user-overridable constructs, such as an iteration strategy for streams, or the underlying executor of a fork-join pool. It seems like an obvious thing to have, because nobody knows better how a thing will be used than the end user.
🌐
DevGlan
devglan.com › java8 › java-8-parallel-steams
Java 8 Parallel Streams Example | DevGlan
May 24, 2017 - Following is an example that processes large streams and compares the time difference between processing time of sequential stream and parallel stream. ParallelStreamPerformanceCheck.java · package com.devglan; import java.util.ArrayList; import java.util.List; /** * @author only2dhir * */ public class ParallelStreamPerformanceCheck { public static void main(String[] args) { List<Integer> numList = new ArrayList<Integer>(); for (int i = 0; i < 1000; i++) { numList.add(i); } // Processing sequentially long startTime = System.currentTimeMillis(); numLi
🌐
GeeksforGeeks
geeksforgeeks.org › java › perform-parallel-processing-on-arrays-in-java-using-streams
How to Perform Parallel Processing on Arrays in Java Using Streams? - GeeksforGeeks
February 14, 2024 - It enables the simultaneous execution of actions on many threads when applied to a stream. Let's look at an example where we wish to do certain actions on each member of an integer array in parallel.
🌐
Medium
hkdemircan.medium.com › java8-concurrency-parallel-stream-88b4aef91a56
Java8 | Concurrency-Parallel Stream | by Hasan Kadir Demircan | Medium
April 7, 2023 - As we discussed concurrency in Java earlier, Now let’s look at that how we can use parallel stream in Java. ... Stream<Integer> stream = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9).stream(); Stream<Integer> parallelStream = stream.parallel(); System.out.println(parallelStream.count());
🌐
GeeksforGeeks
geeksforgeeks.org › java › parallel-vs-sequential-stream-in-java
Parallel vs Sequential Stream in Java - GeeksforGeeks
July 15, 2025 - In this example, list.stream() works in sequence on a single thread with the print() operation, and the content of the list is printed in an ordered sequence because this is a sequential stream. Parallel processing is a very useful feature of Java, even if the whole program may not be parallelized.
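
A small sketch of the ordering point, assuming nothing beyond the standard library: forEachOrdered preserves encounter order even on a parallel stream, and collecting an ordered parallel stream also keeps the order.

```java
import java.util.List;
import java.util.stream.Collectors;

public class OrderingDemo {
    public static void main(String[] args) {
        List<Integer> list = List.of(1, 2, 3, 4, 5);

        // forEach on a parallel stream may print in any order;
        // forEachOrdered preserves encounter order even in parallel.
        list.parallelStream().forEachOrdered(System.out::println);

        // Collecting an ordered parallel stream also preserves order.
        List<Integer> doubled = list.parallelStream()
                .map(n -> n * 2)
                .collect(Collectors.toList());
        System.out.println(doubled); // [2, 4, 6, 8, 10]
    }
}
```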
🌐
Medium
medium.com › @JavaFusion › parallel-streaming-in-java-8-2885373a576a
Parallel streaming in Java 8. In Java 8, parallel streaming allows… | by Java Fusion | Medium
February 2, 2024 - Here’s a simple example demonstrating parallel streaming in Java 8 · package com.streamapi; import java.util.Arrays; import java.util.List; public class ParallelStreamingDemo { public void pStream() { List<Integer> list = Arrays.asList(1, 2, 3, 4, 5); System.out.println("***** Example of normal stream ********"); list.stream() .map(num -> num * 5) .forEach(n -> System.out.println("Value ->" + n + " Thread name ->" + Thread.currentThread().getName())); System.out.println("***** Example of stream parallel() ********"); list.parallelStream() .map(num -> num * 5) .forEach(n -> System.out.printl
🌐
Dev.java
dev.java › learn › parallelizing-streams
Parallelizing Streams
September 14, 2021 - Process your in-memory data with Java streams and collectors. Process them faster with parallel streams.
🌐
DEV Community
dev.to › igalhaddad › java-8-parallel-stream-with-threadpool-32kd
Java 8 Parallel Stream with ThreadPool - DEV Community
February 10, 2020 - public static void parallelSetAll(long[] array, IntToLongFunction generator) { Objects.requireNonNull(generator); IntStream.range(0, array.length).parallel().forEach(i -> { array[i] = generator.applyAsLong(i); }); } ... All three test methods did the same task of calculating a Fibonacci sequence in length of 1,000,000 numbers. ... Using Java 8 parallel streams with a custom ThreadPool can significantly enhance control over multithreading, especially when the default ForkJoinPool.commonPool() isn't suitable for your performance or debugging needs.
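
The custom-pool trick the article describes is usually done by submitting the terminal operation to your own ForkJoinPool, whose worker threads then run the stream's tasks instead of commonPool(). A minimal sketch; the pool size and workload are arbitrary:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.stream.IntStream;

public class CustomPoolDemo {
    public static void main(String[] args) throws Exception {
        // A dedicated pool with 4 workers, separate from commonPool().
        ForkJoinPool pool = new ForkJoinPool(4);
        try {
            // The parallel stream executes inside the submitting pool.
            int sum = pool.submit(
                    () -> IntStream.rangeClosed(1, 10).parallel().sum()
            ).get();
            System.out.println(sum); // 55
        } finally {
            pool.shutdown();
        }
    }
}
```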
🌐
Medium
medium.com › @RupamThakre › java-parallel-streams-3f503ec4b3bc
Java : Parallel Streams. A Comprehensive Guide to Parallel… | by Rupam Thakre | Medium
February 23, 2025 - In this blog, we’ll dive deep into Parallel Streams in Java — what they are, how they work, and when to use them effectively.
🌐
Fast thread
blog.fastthread.io › home › parallel sort
Parallel Sort - Fast thread
August 31, 2025 - For example, java.util.List has a stream() method that will support parallelism. You can create a parallel stream by calling Stream.parallel(). List list = new ArrayList<>(); list.add("a"); …..
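
A minimal Arrays.parallelSort sketch matching the article's topic; the data is made up, and the result is identical to Arrays.sort:

```java
import java.util.Arrays;

public class ParallelSortDemo {
    public static void main(String[] args) {
        int[] data = {5, 3, 8, 1, 9, 2};
        // parallelSort splits the array, sorts chunks in the common
        // ForkJoinPool, then merges them back together.
        Arrays.parallelSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 5, 8, 9]
    }
}
```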
🌐
Bytes Lounge
byteslounge.com › tutorials › java-8-parallel-streams
Java 8 Parallel Streams
One should not expect the parallel stream to process the elements in the order they are defined in the original collection (although one may force the processing to be ordered as we will see later in this article). Let's perform a reduction in order to convert a Collection into a Map with a couple of entries: one containing a list of even numbers and the other containing a list of odd numbers: ... List<Integer> integerList = Arrays.asList(new Integer[] { 1, 2, 3, 4, 5, 6, 7, 8, 9 }); Map<Integer, List<Integer>> evenOddMap = integerList .stream().collect(Collectors.groupingBy(i -> i % 2 == 0 ?
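
The snippet above is cut off mid-expression; a hedged completion of the same even/odd grouping idea, using i % 2 as the key instead of the original ternary (so key 0 holds evens and key 1 holds odds), might look like:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EvenOddGrouping {
    public static void main(String[] args) {
        List<Integer> integerList =
                Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9);

        // groupingBy works the same on a parallel stream; the non-concurrent
        // collector merges partial maps while preserving encounter order.
        Map<Integer, List<Integer>> evenOddMap = integerList
                .parallelStream()
                .collect(Collectors.groupingBy(i -> i % 2));

        System.out.println(evenOddMap.get(0)); // [2, 4, 6, 8]
        System.out.println(evenOddMap.get(1)); // [1, 3, 5, 7, 9]
    }
}
```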