Master the map, filter, and reduce functions

Higher-order functions are functions that take a function as a parameter and/or return a function as an output.
A few useful higher-order functions are map(), filter(), and reduce(). map() and filter() are built-in functions, whereas reduce() is contained in the functools module.
Let's learn about map(), filter(), and reduce() in this article.
map()
map() is used to apply a function to each item in an iterable. map() returns a map object, which is an iterator:
map(function, iterable, ...)
Calculating the square of all numbers in the iterable:
https://gist.github.com/BetterProgramming/1cca99e8487a2c10ad744c3dff3c58d0#file-squaremap-py
square = map(lambda x: x ** 2, num): We can pass a lambda function or a user-defined function. The lambda function's syntax is lambda parameters: expression. map() returns a map object, which is an iterator. We can convert the iterator (map object) to an iterable like a list using the list() constructor (print(list(square))).
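A minimal sketch of the squaring example above (variable names are illustrative; the linked gist likely shows something similar):
num = [1, 2, 3, 4, 5]
square = map(lambda x: x ** 2, num)  # map object (an iterator), computed lazily
print(square)        # <map object at 0x...>
print(list(square))  # Output: [1, 4, 9, 16, 25]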
map vs. for loop
Solving the same problem (square of numbers) using a for loop:
num = [1, 2, 3, 4, 5]
square = []
for i in num:
    square.append(i ** 2)
print(square)
#Output:[1, 4, 9, 16, 25]
- We have to iterate through the iterable (list) using a for loop and append each result to a new list.
- Sequential traversal: we have to iterate through an iterable of length n exactly n times.
map vs. list comprehension
Solving the same problem (square of numbers) using list comprehension:
num = [1, 2, 3, 4, 5]
result = [i ** 2 for i in num]
print(result)
#Output:[1, 4, 9, 16, 25]
A list comprehension returns a list, not an iterator, whereas map() returns a map object, which is an iterator.
Iterators produce values on demand and need very little memory, whereas a list has to store all of its data up front. If you are working with large data, a list comprehension is therefore less suitable: map() returns a map object that is an iterator and computes the values as needed, so it doesn't have to materialize all the values at once.
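A quick, hedged comparison of the memory footprint (sizes are platform-dependent, and sys.getsizeof only measures the container object itself, not the elements):
import sys
num = range(1_000_000)
squares_list = [x ** 2 for x in num]      # materializes one million values
squares_map = map(lambda x: x ** 2, num)  # lazy iterator, nothing computed yet
print(sys.getsizeof(squares_list))  # several megabytes
print(sys.getsizeof(squares_map))   # a small, constant-size object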
map vs. generator expression
A generator expression is preferred over a list comprehension when we work with a large amount of data. A generator expression returns an iterator.
The performance improvement from the use of generators is the result of the lazy (on-demand) generation of values, which translates to lower memory usage.
Calculating the square of all numbers in the iterable:
https://gist.github.com/BetterProgramming/bb8ec8e4d4b48cfb411009dcc9027fb3#file-squareall-py
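A minimal sketch of the comparison the gist likely makes (variable names are illustrative):
num = [1, 2, 3, 4, 5]
square_gen = (x ** 2 for x in num)       # generator expression -> lazy iterator
square_map = map(lambda x: x ** 2, num)  # map() -> lazy iterator
print(list(square_gen))  # Output: [1, 4, 9, 16, 25]
print(list(square_map))  # Output: [1, 4, 9, 16, 25]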
When should we prefer map() over a generator expression?
Both map() and generator expressions return an iterator.
- map() syntax: map(function, iterable)
- generator expression syntax: (expression for item in iterable if conditional)
When the logic of a list comprehension or generator expression becomes too complex to stay readable, it's better to use map(). With map(), we keep the function separate, which makes the design very clear. Another difference is that with map() we can reuse a lambda or a named function definition across several calls, whereas a generator expression's logic is written inline each time, as sketched below.
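A small sketch of that reuse point (the function and data here are hypothetical examples, not from the article):
def to_fahrenheit(celsius):
    return celsius * 9 / 5 + 32
daily = [0, 10, 25]
weekly = [25, 100]
print(list(map(to_fahrenheit, daily)))   # Output: [32.0, 50.0, 77.0]
print(list(map(to_fahrenheit, weekly)))  # Output: [77.0, 212.0]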
map() vs. starmap()
Applying a function to two iterables using map()
:
https://gist.github.com/BetterProgramming/b130b452a8e88ee3b6895d09f6a4ac2b#file-mapiter-py
According to the Python docs, starmap()
is “used instead of map()
when argument parameters are already grouped in tuples from a single iterable (the data has been ‘pre-zipped’).”

https://gist.github.com/BetterProgramming/0ce39af038f67a9a1b5ec4785d873b1a#file-starmap-py
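A minimal sketch of the difference (illustrative data; the linked gists may use different values):
from itertools import starmap
a = [1, 2, 3]
b = [4, 5, 6]
print(list(map(pow, a, b)))       # map() with two separate iterables -> [1, 32, 729]
pairs = [(1, 4), (2, 5), (3, 6)]  # the same data, "pre-zipped" into tuples
print(list(starmap(pow, pairs)))  # starmap() unpacks each tuple -> [1, 32, 729]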
filter()
filter() is used to filter the elements from the iterable that match a certain condition:
filter(function, iterable)
Filtering the even numbers in the iterable:
https://gist.github.com/BetterProgramming/f7e34a98d5f27db3fec888f91f337a17#file-evenfilter-py
filter() also returns a filter object, which is an iterator. It doesn't immediately go over all the elements but returns the next value when we ask for it (using next()):
https://gist.github.com/BetterProgramming/7ff050ce032c662f0c09e4db3a0344fe#file-filternext-py
If we want all the results at once, we can convert it to a list using the list() constructor.
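A minimal sketch combining both gists (variable names are illustrative):
num = [1, 2, 3, 4, 5, 6]
evens = filter(lambda x: x % 2 == 0, num)  # filter object (an iterator)
print(next(evens))  # 2
print(next(evens))  # 4
print(list(evens))  # [6] -- the remaining elements, all at once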
filter() vs. filterfalse()
“Make an iterator that filters elements from iterable returning only those for which the predicate is False.” — Python docs
filterfalse(predicate,iterable)
Filtering the elements that don’t start with “r”:
https://gist.github.com/BetterProgramming/f2a5517d793d9256e8bd6d83a122fd7f#file-filternor-py
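A minimal sketch of that filterfalse() example (the word list is illustrative):
from itertools import filterfalse
colors = ["red", "green", "rose", "blue"]
result = filterfalse(lambda c: c.startswith("r"), colors)
print(list(result))  # Output: ['green', 'blue']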
reduce()
The functools module provides the following function:
functools.reduce()
It applies a function of two arguments cumulatively to the items of the iterable, from left to right, so as to reduce the iterable to a single value:
functools.reduce(function, iterable)
Python provides some built-in reducing functions:
- max() returns the largest number in the iterable.
- min() returns the smallest number in the iterable.
- sum() returns the sum of all numbers in the iterable.
Finding the product of all elements in the list:
from functools import reduce
num1 = [1, 2, 3, 4, 5]
num2 = reduce(lambda x, y: x * y, num1)
print(num2)
#Output:120
reduce() vs. accumulate()
“Make an iterator that returns accumulated sums, or accumulated results of other binary functions (specified via the optional func argument).” — Python docs
itertools.accumulate(iterable[, func])
Finding the product of the numbers in an iterable and finding the running accumulated value:

https://gist.github.com/BetterProgramming/0c8c378b216217b9c3f80e1c7480f4bf#file-accumulate-py
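A minimal sketch contrasting the two (variable names are illustrative and likely close to the linked gist):
from functools import reduce
from itertools import accumulate
num = [1, 2, 3, 4, 5]
print(reduce(lambda x, y: x * y, num))            # Output: 120 (single final value)
print(list(accumulate(num, lambda x, y: x * y)))  # Output: [1, 2, 6, 24, 120] (running values)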
Key Takeaways
- map(): Apply a function to each element in the iterable (map(function, iterable)).
- filter(): Filter the elements in the iterable that match a certain condition (filter(function, iterable)).
- reduce(): Take successive elements in the iterable and combine them in a certain way (reduce(function, iterable)).
Watch this space for more articles on Python and data science. If you'd like to read more of my tutorials, follow me on Medium, LinkedIn, and Twitter.