How to map elements to their index using streams?

August 24, 2019, at 10:10 PM

I get a stream of some custom objects and I would like to create a map Map<Integer, MyObject> with index of each object as key. To give you a simple example:

Stream<String> myStream = Arrays.asList("one","two","three").stream();
Integer i = 0;
Map<Integer, String> result3 = myStream.collect(Collectors.toMap(x -> i++, x -> x));

Obviously, this doesn't compile because:

local variables referenced from a lambda expression must be final or effectively final

Is there a simple way to map elements of a stream to their indices so that the expected output for the above example is something like:

{1=one, 2=two, 3=three}
Answer 1

Your i variable is not effectively final.

You can use an AtomicInteger as an Integer wrapper:

Stream<String> myStream = Arrays.asList("one","two","three").stream();
AtomicInteger atomicInteger = new AtomicInteger(0);
Map<Integer, String> result3 = myStream.collect(Collectors.toMap(x -> atomicInteger.getAndIncrement(), Function.identity()));

I consider it a bit hacky because it only works around the effectively-final restriction. Since AtomicInteger is a thread-safe class, it may also introduce some overhead. The pure stream solution in the answer by Samuel Philipp might fit your needs better.

Answer 2

You can use an IntStream to solve this:

List<String> list = Arrays.asList("one","two","three");
Map<Integer, String> map = IntStream.range(0, list.size()).boxed()
        .collect(Collectors.toMap(Function.identity(), list::get));

You create an IntStream from 0 to list.size() - 1 (IntStream.range() excludes the upper bound) and map each index to the value at that position in your list. The advantage of this solution is that it also works with parallel streams, which is not possible with the AtomicInteger approach.

So the result in this case would be:

{0=one, 1=two, 2=three}
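
Because the indices come from the range itself rather than from a shared mutable counter, the same pipeline can be parallelized. A minimal sketch of that variant (only the call to parallel() is added here):

List<String> list = Arrays.asList("one", "two", "three");
Map<Integer, String> map = IntStream.range(0, list.size()).parallel().boxed()
        .collect(Collectors.toMap(Function.identity(), list::get));
// prints {0=one, 1=two, 2=three}
System.out.println(map);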

To start the first index at 1, you can simply add 1 in the key mapper:

List<String> list = Arrays.asList("one", "two", "three");
Map<Integer, String> map = IntStream.range(0, list.size()).boxed()
        .collect(Collectors.toMap(i -> i + 1, list::get));

This results in:

{1=one, 2=two, 3=three}
Answer 3

Try this:

Let's say String[] array = { "V","I","N","A","Y" }; then:

AtomicInteger index = new AtomicInteger(0); // counter supplying the running index
Arrays.stream(array)
        .map(ele -> index.getAndIncrement() + " -> " + ele)
        .forEach(System.out::println);

Output:

0 -> V
1 -> I
2 -> N
3 -> A
4 -> Y
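
If you need an actual Map<Integer, String> rather than console output (which is what the question asks for), the same counter can feed Collectors.toMap. This is only a sketch extending the snippet above, not part of the original answer:

AtomicInteger index = new AtomicInteger(0);
Map<Integer, String> result = Arrays.stream(array)
        .collect(Collectors.toMap(e -> index.getAndIncrement(), Function.identity()));
// prints {0=V, 1=I, 2=N, 3=A, 4=Y}
System.out.println(result);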
Answer 4

We can use the List.indexOf(Object o) method to get the index of each element in the list while constructing the Map:

 List<String> list = Arrays.asList("one","two","three");
 Map<Integer, String> result = list.stream()
                                   .collect(Collectors.toMap(
                                    k -> list.indexOf(k) + 1, 
                                    Function.identity(),
                                    (v1, v2) -> v2));
 System.out.println(result);

If there are duplicates in the list, the index of the first occurrence of the element will be used in the final map. Also, to resolve the key collision that duplicates cause, we need to ensure that a merge function is supplied to toMap(keyMapper, valueMapper, mergeFunction).

Output:

{1=one, 2=two, 3=three}
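
For example, with a hypothetical list that contains a duplicate value (purely to illustrate the merge function):

 List<String> listWithDup = Arrays.asList("one", "two", "three", "two");
 Map<Integer, String> result = listWithDup.stream()
                                   .collect(Collectors.toMap(
                                    k -> listWithDup.indexOf(k) + 1,
                                    Function.identity(),
                                    (v1, v2) -> v2));
 // both "two" elements map to key 2 (index of the first occurrence),
 // so the merge function (v1, v2) -> v2 decides which value is kept
 System.out.println(result); // {1=one, 2=two, 3=three}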