After spending time working through the NeetCode 150 list, I decided to start organizing my notes, beginning here with arrays. As I revisit the list, I'm refining my notes and updating this document with clearer examples, key takeaways, and helpful patterns I didn't catch the first time. Everything here is written in Python, since that's what I'm using to solve the problems.
These are things I reviewed, things I found useful, and examples that helped things click. Hopefully, they’ll be useful for you too.
Source: NeetCode platform (https://neetcode.io/practice), personal study sessions, coaching guidance, and A Common-Sense Guide to Data Structures and Algorithms by Jay Wengrow.
Arrays are one of the most basic and fundamental data structures. They are stored contiguously in memory. There are two types of arrays: static and dynamic.
Static Arrays
Think of static arrays like a row of lockers. You know exactly how many you need and what will go into each one ahead of time.
In statically typed languages like Java, C++, and C#, arrays must be initialized with a fixed size and type. These are known as static arrays.
Once declared, their size cannot change. If you try to add more elements than the capacity allows, it will fail.
Key operations for static arrays:

- Reading: Accessing a value by index is O(1).
- Traversing: Going through the whole array is O(n).
- Insertion/Deletion at the End: Often done with an overwrite and a length adjustment.
- Insertion/Deletion in the Middle: Requires shifting elements and is O(n).
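Python doesn't expose static arrays directly, but here's a rough sketch (the class and method names are my own, not from the source) of how a fixed-capacity array can be simulated with a preallocated list and a length counter:

class StaticArray:
    def __init__(self, capacity):
        self.capacity = capacity
        self.length = 0
        self.data = [None] * capacity # all storage allocated up front

    def get(self, i):
        # O(1) read by index
        return self.data[i]

    def push_back(self, val):
        # O(1) insert at the end; fails once capacity is reached
        if self.length == self.capacity:
            raise OverflowError("static array is full")
        self.data[self.length] = val
        self.length += 1

    def insert_at(self, i, val):
        # O(n) insert in the middle: shift elements one slot to the right
        if self.length == self.capacity:
            raise OverflowError("static array is full")
        for j in range(self.length, i, -1):
            self.data[j] = self.data[j - 1]
        self.data[i] = val
        self.length += 1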
Dynamic Arrays
Dynamic arrays are a more flexible alternative to static arrays. In Python and JavaScript, lists and arrays grow dynamically under the hood. You don't need to specify a size upfront. Internally, when a dynamic array reaches capacity, it creates a new array (usually double the size), copies the elements over, and appends the new one. This copying step is O(n), but it happens infrequently, so the average time to insert an item is still O(1). This is called amortized time.
Here's a Python-style sketch of how this works, tracking the logical length separately from the allocated capacity:

def push_back(arr, length, capacity, val):
    # If the array is full, allocate a new array with double the
    # capacity and copy every existing element over (the O(n) step)
    if length == capacity:
        capacity *= 2
        new_arr = [0] * capacity
        for i in range(length):
            new_arr[i] = arr[i]
        arr = new_arr
    # Write the new value into the next free slot: O(1)
    arr[length] = val
    return arr, length + 1, capacity
Why double the size? Because doubling keeps the total number of copies over time bounded. Creating an array of size n might take up to 2n copy operations overall, but that still averages to O(1) per append in the long run.
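To make the math concrete, here's a small counting sketch (my own illustration, not from the source): it simulates appending n items to an array that starts at capacity 1 and doubles when full, counting how many element copies happen along the way.

def total_copies(n):
    # Simulate appending n items to a doubling dynamic array
    capacity, length, copies = 1, 0, 0
    for _ in range(n):
        if length == capacity:
            copies += length # every existing element is copied once
            capacity *= 2
        length += 1
    return copies

print(total_copies(1000)) # 1023 copies, below the 2n = 2000 bound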
Note: Inserting or deleting in the middle of a dynamic array requires shifting, which makes those operations O(n), just like static arrays.
Reading from an Array
Accessing an element in an array with an index is O(1) because each index is directly mapped to a memory address.
array = [1, 2, 3, 4, 5, 6, 7, 8]
i = 4
array[i] # => 5
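To make the "mapped to a memory address" idea concrete, here's a small sketch using Python's built-in array module (my illustration, not from the source). An array.array stores its elements contiguously, so the i-th element lives at a fixed offset from the base address:

import array

a = array.array('i', [1, 2, 3, 4, 5, 6, 7, 8]) # contiguous C ints
base_address, length = a.buffer_info()

# Address of the i-th element = base + i * element size,
# which is why reading by index is a single O(1) calculation
i = 4
print(base_address + i * a.itemsize)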
Inserting Elements into an Array
Appending is efficient and commonly used when building arrays iteratively. Appending to the end is O(1):

array = [1, 2, 3]
array.append(4) # => [1, 2, 3, 4]

Inserting at an arbitrary index is O(n), because every element to the right of the insertion point has to shift over:

array = [1, 2, 3]
array.insert(0, 4) # => [4, 1, 2, 3]

array = [1, 2, 3]
array.insert(1, 4) # => [1, 4, 2, 3]

While less common than appending, inserting into specific positions can be useful in certain algorithms like merging intervals, or when maintaining a custom order (see the sketch below).
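One common use of positional inserts is keeping a list sorted as you add to it. Here's a small sketch using the standard bisect module (my example, not from the source); insort finds the position with binary search but still shifts elements, so each insert remains O(n):

import bisect

sorted_nums = [1, 3, 4, 6, 8]
bisect.insort(sorted_nums, 5) # binary search finds the spot, then elements shift
print(sorted_nums) # [1, 3, 4, 5, 6, 8]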
Deleting Elements from an Array
Removing from the end is O(1):

array = [1, 2, 3]
array.pop() # => 3

Removing from an arbitrary index is O(n):

array = [1, 2, 3]
array.pop(0) # => 1

array = [1, 2, 3]
array.pop(1) # => 2
Array Traversal
# Using a for loop
for i in range(len(array)):
    print(array[i])

# Using a while loop
i = 0
while i < len(array):
    print(array[i])
    i += 1
Matrix Traversal
A 2x2 matrix example:
matrix = [[1, 2], [3, 4]]
for i in range(len(matrix)):
    for j in range(len(matrix[i])):
        print(matrix[i][j])
Runtimes
Operation | Time Complexity |
---|---|
Read/write ith element | O(1) |
Insert/remove end | O(1) |
Insert/remove ith element | O(n) |
Insert/remove beginning | O(n) |
These complexities depend on how much shifting or re-indexing is needed. For example, inserting or removing at the beginning or middle of an array requires shifting all subsequent elements, which is why it's O(n).
While revisiting problems like Two Sum, Product Except Self, or Merge Intervals, I wanted to collect the core Python list concepts here as a quick reference for myself. If you're also practicing DSA in Python, feel free to follow along. I'm adding more notes and solutions to this GitHub repo as I go.
Python List Essentials
List Comprehension
List comprehensions offer a concise way to create lists.
# Squares of even numbers from 0 to 9
squares = [x ** 2 for x in range(10) if x % 2 == 0]
List Slicing

# Slicing syntax: [start:stop:step]
a = [1, 2, 3, 4, 5]
print(a[1:4]) # [2, 3, 4]
print(a[::2]) # [1, 3, 5]
print(a[::-1]) # [5, 4, 3, 2, 1]
Common List Methods
Method | Description |
---|---|
append(x) | Add item to end |
insert(i, x) | Insert at specific index |
pop() | Remove and return last item |
remove(x) | Remove first occurrence of value |
index(x) | Return index of first occurrence |
count(x) | Count number of occurrences |
sort() | Sort the list in place |
reverse() | Reverse the list in place |
copy() | Return shallow copy of the list |
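A quick run through a few of these methods, with my own example values:

nums = [3, 1, 2]
nums.append(4) # [3, 1, 2, 4]
nums.insert(0, 5) # [5, 3, 1, 2, 4]
nums.remove(3) # [5, 1, 2, 4] (first occurrence of 3 removed)
nums.sort() # [1, 2, 4, 5]
nums.reverse() # [5, 4, 2, 1]
print(nums.index(4)) # 1
print(nums.count(5)) # 1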
🔍 Interview Tricks & Patterns I Found Useful
Here are a few Python-based DSA patterns that show up often in interviews. These are handy to keep in your toolkit when solving array-related problems.
Binary Search on Sorted Arrays
import bisect
arr = [1, 3, 4, 6, 8]
index = bisect.bisect_left(arr, 4) # => 2
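Interviewers often want the loop written out by hand rather than a call to bisect, so here's a standard version (my sketch) that returns the index of the target, or -1 if it isn't present:

def binary_search(arr, target):
    # Classic binary search on a sorted array: O(log n)
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

binary_search([1, 3, 4, 6, 8], 4) # => 2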
List Copy vs Reference
arr = [1, 2, 3]
arr2 = arr # Reference (changes affect both)
arr3 = arr[:] # Shallow copy (safe duplicate)
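One caveat worth adding: a shallow copy only duplicates the outer list, so nested lists are still shared between the copies. copy.deepcopy gives a fully independent copy (my example values below):

import copy

nested = [[1, 2], [3, 4]]
shallow = nested[:]
deep = copy.deepcopy(nested)

nested[0].append(99)
print(shallow[0]) # [1, 2, 99] (the inner list is shared)
print(deep[0]) # [1, 2] (fully independent)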
Sliding Window Pattern
def max_subarray_sum(nums, k):
    # Sum of the first window of size k
    max_sum = curr = sum(nums[:k])
    # Slide the window: add the element entering, subtract the one leaving
    for i in range(k, len(nums)):
        curr += nums[i] - nums[i - k]
        max_sum = max(max_sum, curr)
    return max_sum
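For example, with my own test values:

max_subarray_sum([2, 1, 5, 1, 3, 2], 3) # => 9 (the window 5 + 1 + 3)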
Two Pointers for Target Sum
def has_pair_with_sum(arr, target):
    # Sort first (O(n log n)); note this sorts arr in place
    arr.sort()
    left, right = 0, len(arr) - 1
    while left < right:
        s = arr[left] + arr[right]
        if s == target:
            return True
        elif s < target:
            left += 1   # need a larger sum
        else:
            right -= 1  # need a smaller sum
    return False
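And a quick check with my own test values:

has_pair_with_sum([8, 1, 4, 3], 7) # => True (3 + 4)
has_pair_with_sum([8, 1, 4, 3], 20) # => False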