The problems (and their solutions) are fairly simple to understand. Several organizations, including the likes of Microsoft and Facebook, ask this problem in phone interviews. Both problems are generally approached with the two-pointer technique.

The two-pointer technique consists of running two pointers at different paces and/or in different directions. This post works as a quick cheat sheet for you to revise from; it is not meant to be a detailed explanation.
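As a minimal illustration of the two variants before we get to the problems (these helper functions are my own examples, not part of the problems below): pointers moving in opposite directions, and pointers moving at different paces.

```python
def is_palindrome(s):
    # Opposite directions: pointers start at both ends and move inward,
    # comparing characters until they cross.
    left, right = 0, len(s) - 1
    while left < right:
        if s[left] != s[right]:
            return False
        left += 1
        right -= 1
    return True


def middle(items):
    # Different paces: the fast pointer advances two steps for every one
    # step of the slow pointer, so the slow pointer ends up at the middle.
    slow = fast = 0
    while fast + 1 < len(items):
        slow += 1
        fast += 2
    return items[slow]
```

Both run in `O(n)` time with `O(1)` extra space, which is the kind of trade-off the solutions below aim for.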

## Table of contents

**Two sum:**

❓ The problem

🏎 Time optimized; T: O(n), S: O(n)

🧐 Space optimized; T: O(nlogn), S: O(1)

**Three sum:**

❓ The problem

🤔 Solution; T: O(n²), S: O(n) to O(1)

## Two sum: the problem

Given a list of numbers and a target, find a pair of numbers whose sum equals the target. Example:

```
Numbers: 12 21 10 11 13 25
Target:  35
Output:  10 25
```

## Two sum: time optimized; T: O(n), S: O(n)

A hash table's complexity for insertion and search is `O(1)`. Leveraging this, a map can be created that stores `target - element` for each element. When an element matches such a stored value, the two make up the pair that adds up to the target. Both the time and space complexity are `O(n)`, because in the worst case the hash map could grow as long as the numbers array.

```python
nums = [12, 21, 10, 11, 13, 25]
num_map, target = {}, 35

for i in range(len(nums)):
    # The complement needed to pair with nums[i].
    num = target - nums[i]
    if num in num_map:
        # The complement was seen earlier: pair found.
        print(num, nums[i])
        break
    else:
        # Record the current element as a seen key.
        num_map[nums[i]] = num
```

## Two sum: space optimized; T: O(nlogn), S: O(1)

When we intend to reduce the memory footprint, sorting is a reasonable consideration. However, sorting algorithms also take space. In this case, algorithm stability is unnecessary because we are working with numbers, and preserving their original order after sorting doesn't matter. That leaves us with Heapsort and more exotic algorithms like Smoothsort and Block sort. They all have `O(nlogn)` time and `O(1)` space complexity. The stable sorting alternatives have `O(n)` space complexity, which doesn't improve upon the previous solution.

The overall time complexity becomes `O(nlogn)`, dominated by the sort; the two-pointer scan itself is `O(n)`. Because the numbers are now sorted, we can take advantage of their order. The two indices `left` and `right` start, in turn, from the first element (0th position in the array) and the last element (the last position in the array). We then sum both elements and, based on the difference from the target, move the indices.

If the sum is an exact match with the target, we print the output and exit. If the target is higher than the sum, we move the `left` index to the next bigger element in the array. Otherwise, we decrease the `right` index to consider a smaller number, and try to match the sum with the target again. We continue this until `left` and `right` cross over.

```python
import heapq  # In real interviews, implement your own heap


def heapsort(nums):
    heapq.heapify(nums)
    return [heapq.heappop(nums) for _ in range(len(nums))]


nums = heapsort([12, 21, 10, 11, 13, 25])
target, left, right = 35, 0, len(nums) - 1

while left < right:
    s = nums[left] + nums[right]
    if target == s:
        print(nums[left], nums[right])
        break
    elif s < target:
        left += 1   # Need a bigger sum
    else:
        right -= 1  # Need a smaller sum
```

## Three sum: the problem

Given a list of numbers and a target, find a triplet from the given numbers whose sum equals the target.

```
Numbers: 12 21 10 11 13 25
Target:  56
Output:  21 10 25
```

## Three sum: T: O(n²), S: O(n) to O(1)

There are three indices and two loops at play here. The main index `i` loops over the numbers from `0` (0th position in the array) to `size of the numbers - 2`. This is the slowest loop. It is like the solar system: the farther you are from the sun, the longer you take to complete an orbit.

The other two indices are `left` and `right`; they start, in turn, from the element just after `i` (position `i + 1` in the array) and the last element (the last position in the array). This inner loop is the fastest.

The idea here is that if we assume the current element in the outer loop is part of the answer, we move the inner loop to identify the other two elements in the solution. If that is not the case, we move the outer loop forward to the next element and check again.

We run the inner loop until `left` and `right` cross over. If the sum of the `i`-th, `left`-th and `right`-th elements matches the `target`, we record the triplet and stop. If the sum is less than the `target`, we move the `left` index forward. Otherwise, we move the `right` index backward. This works because we already sorted the array.

Timsort is many languages' default sorting implementation (including Python's). In that case, the space complexity is `O(n)`. As explained in the previous solution, Heapsort can reduce the space complexity to `O(1)` while keeping the time complexity the same, at the cost of writing more code. Either way, the total time complexity is `O(n²)`: the two nested loops dominate the `O(nlogn)` sort.

```python
nums = [12, 21, 10, 11, 13, 25]
target, triplet = 56, None
nums.sort()

for i in range(len(nums) - 2):
    # Assume nums[i] is part of the answer; two-pointer scan for the rest.
    left = i + 1
    right = len(nums) - 1
    while left < right:
        s = nums[i] + nums[left] + nums[right]
        if s == target:
            triplet = (nums[i], nums[left], nums[right])
            break
        elif s < target:
            left += 1
        else:
            right -= 1
    if triplet:
        break

print(triplet)
```

Can the time and space complexity be optimized more?
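One direction worth sketching (my own variant, not a strictly better bound): fixing the outer element and reusing the hash-based two-sum idea on the remaining elements keeps the time at `O(n²)` while skipping the sort entirely, trading the `O(1)` space back for `O(n)`.

```python
def three_sum_hash(nums, target):
    # For each anchor element, run the hash-based two-sum
    # over the suffix that follows it.
    for i in range(len(nums) - 2):
        seen = set()
        for num in nums[i + 1:]:
            complement = target - nums[i] - num
            if complement in seen:
                return (nums[i], complement, num)
            seen.add(num)
    return None


print(three_sum_hash([12, 21, 10, 11, 13, 25], 56))  # (21, 10, 25)
```

Whether this is an improvement depends on which resource matters more in your setting; asymptotically the time stays quadratic.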

## Conclusion

The problems we solved here do not strictly require the two-pointer technique, but it is what we used throughout this post to reach optimal solutions.