feat(datastructures, trees): tree traversal algorithms #169
BrianLusina wants to merge 11 commits into main from
Conversation
📝 Walkthrough

Adds a new, standalone BinarySearchTree implementation and related utilities, exposes only public BST symbols from the package, augments BinaryTree with two range/univalue helpers, and introduces many new algorithm problems, implementations, and tests across backtracking, dynamic programming, graphs, intervals, and tree utilities. DIRECTORY.md updated accordingly.

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (warning)
Actionable comments posted: 9
🤖 Fix all issues with AI agents
In `@datastructures/trees/binary/search_tree/binary_search_tree.py`:
- Around line 11-14: The iterator stack is created empty in __init__ but never
populated, so next() and has_next() fail; update the BinarySearchTree __init__
to populate the iterator by calling __leftmost_inorder(root) (i.e., invoke the
class's leftmost inorder initializer after creating DynamicSizeStack) so the
stack contains the leftmost path from root, ensuring has_next() and next()
behave immediately; reference __init__, __leftmost_inorder, next, has_next,
DynamicSizeStack, BinaryTreeNode and mirror how BinarySearchTreeIterator
initializes its stack.
- Around line 789-800: The loop mutates the tree by reassigning self.root while
searching for the LCA (you use self.root = self.root.left / self.root =
self.root.right and finally return self.root); fix it by using a local traversal
pointer (e.g., current = self.root) inside the method that checks current.data
against node_one.data and node_two.data, move current to current.left or
current.right as needed, and return current (not self.root) so the tree's root
remains unchanged; ensure all references to left/right comparisons and the final
returned node use that local variable.
- Around line 218-224: The has_next() method currently returns
self.stack.is_empty() which is inverted; update has_next (in class
BinarySearchTree iterator logic) to return the opposite boolean (i.e., True when
self.stack is not empty) so the iterator reports there are remaining items;
locate the has_next method referencing self.stack and change the return to the
negation of is_empty (return not self.stack.is_empty()) to fix the logic.
- Around line 404-422: The breadth_first_search method currently enqueues and
dequeues nodes but never collects or returns values; modify breadth_first_search
to create a results list (e.g., results = []), when dequeuing a node append its
value (e.g., current_node.value or current_node.data depending on your node
field) to results, continue enqueuing left/right children as before, and after
the loop return the results list; ensure you handle the case where self.root is
None by returning an empty list and use the existing FifoQueue, current_node,
and self.root symbols to locate the code to update.
- Around line 349-354: The loop incorrectly computes current_diff using
self.root.data instead of the traversing node; update the calculation to use
current.data (i.e., compute current_diff = abs(target - current.data)) so
min_diff and closest_node are compared/updated against the node actually being
visited (variables: current, min_diff, closest_node); make the single-line
change where current_diff is assigned and ensure no other references still use
self.root.data inside this traversal.
- Around line 690-708: merge_trees currently always operates on self.root and
calls itself with other_node children, corrupting recursion; implement a helper
function (e.g., _merge_nodes(self_node: BinaryTreeNode, other_node:
BinaryTreeNode) -> BinaryTreeNode) that merges two corresponding nodes: if one
node is None return the other, if both present sum data into self_node.data,
then set self_node.left = _merge_nodes(self_node.left, other_node.left) and
self_node.right = _merge_nodes(self_node.right, other_node.right); change
merge_trees to call and return _merge_nodes(self.root, other_node) so merging
correctly traverses both trees.
- Around line 506-534: The in_order_morris_traversal implementation is
corrupting the tree by permanently nulling left links (temp.left = None); update
the function to implement the standard Morris algorithm: for each node current,
if current.left is None append current.data and move current = current.right;
otherwise find the inorder predecessor pre = rightmost node in current.left and
if pre.right is None set pre.right = current and move current = current.left,
else (pre.right == current) set pre.right = None, append current.data and move
current = current.right. Remove the permanent left-clearing (temp.left = None)
and use the predecessor's right pointer restoration to leave the tree intact
(references: in_order_morris_traversal, current, pre, temp).
- Around line 468-485: The in_order_recurse method is wrong because it ignores
its node parameter and always uses self.root; change it to use the passed node
as the recursion root, add a base-case check for None, call
self.in_order_recurse(node.left) and self.in_order_recurse(node.right) and
aggregate their returned lists (extend or concat) into result before and after
appending node.data, and ensure the function returns the aggregated result;
update references in the function body (notably in_order_recurse and any uses of
self.root) to use node instead.
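The LCA fix described above (a local traversal pointer instead of reassigning `self.root`) can be sketched as a standalone function; `BstNode` here is a minimal stand-in for the repo's `BinaryTreeNode`, and the demo tree is illustrative only:

```python
class BstNode:
    """Minimal stand-in for the repo's BinaryTreeNode."""
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None


def lowest_common_ancestor(root, node_one, node_two):
    # Walk with a local pointer so the search never mutates the tree's root.
    current = root
    while current:
        if node_one.data < current.data and node_two.data < current.data:
            current = current.left
        elif node_one.data > current.data and node_two.data > current.data:
            current = current.right
        else:
            # The paths to the two nodes diverge here: this is the LCA.
            return current
    return None


# Hypothetical demo tree:      6
#                            /   \
#                           2     8
#                          / \
#                         0   4
root = BstNode(6)
root.left = BstNode(2)
root.right = BstNode(8)
root.left.left = BstNode(0)
root.left.right = BstNode(4)
```

After the search, `root` still points at the same node with the same children, which is the whole point of the fix.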
In `@datastructures/trees/binary/tree/binary_tree.py`:
- Around line 947-950: The DFS pruning is wrong for an unordered BinaryTree:
inside the inner function dfs(node: Optional[BinaryTreeNode]) stop-checking on
node.data being out of [low, high] must be removed because children can still be
in-range; instead, change dfs to only return 0 for a None node, recursively sum
dfs(node.left) and dfs(node.right), and then add node.data to the sum only if
low <= node.data <= high (referencing dfs, BinaryTreeNode, low, high).
```python
def __init__(self, root: Optional[BinaryTreeNode] = None):
    super().__init__(root)
    self.stack = DynamicSizeStack()
```
Iterator stack not initialized with tree nodes.
The constructor creates an empty stack, but next() and has_next() expect the stack to be populated via __leftmost_inorder(). Unlike BinarySearchTreeIterator (which calls __leftmost_inorder(root) in its constructor), this class leaves the stack empty, so calling next() will fail immediately.
Consider either:
- Calling `__leftmost_inorder(root)` in the constructor if iteration is expected immediately
- Documenting that users must explicitly initialize the iterator state
```python
def has_next(self) -> bool:
    """
    Checks if the BST has items left. Since this uses a stack, then we simply check if the stack still has items.
    This is used in an iterator approach to getting items from the BST. This returns True if there are items & False
    otherwise, the Time Complexity here is O(1)
    """
    return self.stack.is_empty()
```
Inverted return value in has_next().
The method returns self.stack.is_empty(), which is True when the stack is empty. However, has_next() should return True when there are more items to iterate, not when the stack is empty.
🐛 Proposed fix

```diff
 def has_next(self) -> bool:
     """
     Checks if the BST has items left. Since this uses a stack, then we simply check if the stack still has items.
     This is used in an iterator approach to getting items from the BST. This returns True if there are items & False
     otherwise, the Time Complexity here is O(1)
     """
-    return self.stack.is_empty()
+    return not self.stack.is_empty()
```
+ return not self.stack.is_empty()🤖 Prompt for AI Agents
In `@datastructures/trees/binary/search_tree/binary_search_tree.py` around lines
218 - 224, The has_next() method currently returns self.stack.is_empty() which
is inverted; update has_next (in class BinarySearchTree iterator logic) to
return the opposite boolean (i.e., True when self.stack is not empty) so the
iterator reports there are remaining items; locate the has_next method
referencing self.stack and change the return to the negation of is_empty (return
not self.stack.is_empty()) to fix the logic.
```python
while current:
    current_diff = abs(target - self.root.data)

    if current_diff < min_diff:
        min_diff = current_diff
        closest_node = current
```
Bug: Comparing against wrong node in the traversal loop.
Line 350 computes current_diff = abs(target - self.root.data) but should use current.data. The loop traverses via current, but the difference is always calculated against self.root, so min_diff and closest_node will rarely update correctly.
🐛 Proposed fix

```diff
 while current:
-    current_diff = abs(target - self.root.data)
+    current_diff = abs(target - current.data)
     if current_diff < min_diff:
```
```python
def breadth_first_search(self) -> List[Any]:
    """
    Performs a breadth first search through a Binary Tree
    This will traverse the tree level by level and depth by depth. Using a Queue to put elements into the queue
    """
    queue = FifoQueue()

    # start off by adding the root node
    queue.enqueue(self.root)

    # while the queue is not empty, we want to traverse the tree and add elements to the queue,
    while not queue.is_empty():
        current_node = queue.dequeue()

        if current_node.left:
            queue.enqueue(current_node.left)

        if current_node.right:
            queue.enqueue(current_node.right)
```
Method has return type List[Any] but returns nothing.
The method traverses the tree but doesn't collect or return any results, despite the return type annotation indicating it should return a list.
🔧 Proposed fix

```diff
 def breadth_first_search(self) -> List[Any]:
     """
     Performs a breadth first search through a Binary Tree
     This will traverse the tree level by level and depth by depth. Using a Queue to put elements into the queue
     """
+    result = []
     queue = FifoQueue()
     # start off by adding the root node
     queue.enqueue(self.root)
     # while the queue is not empty, we want to traverse the tree and add elements to the queue,
     while not queue.is_empty():
         current_node = queue.dequeue()
+        result.append(current_node.data)
         if current_node.left:
             queue.enqueue(current_node.left)
         if current_node.right:
             queue.enqueue(current_node.right)
+
+    return result
```
```python
def in_order_recurse(self, node: BinaryTreeNode) -> List[T]:
    """
    Another type of Depth First Search (DFS) that traverses the tree from the left to middle to right of the tree.
    This type of search will begin at the left node and check if that node has a left child and continually check
    until that left node is a leaf(has no children) and will then print its data and "bubble up" back to the
    current node and execute that (in this case print it) and then print the right node. The same procedure is
    executed for the right side of the tree.
    """
    result = []
    if self.root:
        if self.root.left:
            self.in_order_recurse(self.root.left)

        result.append(self.root.data)

        if self.root.right:
            self.in_order_recurse(self.root.right)
    return result
```
Method ignores node parameter and uses self.root instead.
The method accepts a node parameter but never uses it—all operations are on self.root. This causes incorrect traversal since recursive calls don't progress through the tree structure.
🐛 Proposed fix

```diff
 def in_order_recurse(self, node: BinaryTreeNode) -> List[T]:
-    """
-    Another type of Depth First Search (DFS) that traverses the tree from the left to middle to right of the tree.
-    ...
-    """
     result = []
-    if self.root:
-        if self.root.left:
-            self.in_order_recurse(self.root.left)
-
-        result.append(self.root.data)
-
-        if self.root.right:
-            self.in_order_recurse(self.root.right)
+    if node:
+        result.extend(self.in_order_recurse(node.left))
+        result.append(node.data)
+        result.extend(self.in_order_recurse(node.right))
     return result
```
```python
def in_order_morris_traversal(self) -> List[Any]:
    result = []
    current = self.root
    pre = None

    while current:
        if not current.left:
            # add the current data of the node
            result.append(current.data)
            # Move to next right node
            current = current.right
        else:
            # we have a left subtree
            pre = current.left

            # find rightmost
            while pre.right:
                pre = pre.right

            # put current after the pre node
            pre.right = current
            # store current node
            temp = current
            # move current to top of new tree
            current = current.left
            # original current left be None, avoid infinite loops
            temp.left = None

    return result
```
Morris traversal permanently corrupts the tree structure.
The standard Morris traversal creates temporary threaded links and removes them after visiting. This implementation sets temp.left = None (line 532), permanently destroying the left subtree links. After calling this method, the tree's left branches are lost.
🐛 Proposed fix with proper link restoration

```diff
 def in_order_morris_traversal(self) -> List[Any]:
     result = []
     current = self.root
-    pre = None
     while current:
         if not current.left:
-            # add the current data of the node
             result.append(current.data)
-            # Move to next right node
             current = current.right
         else:
-            # we have a left subtree
             pre = current.left
-
-            # find rightmost
-            while pre.right:
+            # Find rightmost node in left subtree (or thread back to current)
+            while pre.right and pre.right != current:
                 pre = pre.right
-            # put current after the pre node
-            pre.right = current
-            # store current node
-            temp = current
-            # move current to top of new tree
-            current = current.left
-            # original current left be None, avoid infinite loops
-            temp.left = None
+            if not pre.right:
+                # Create thread
+                pre.right = current
+                current = current.left
+            else:
+                # Thread exists, remove it and visit current
+                pre.right = None
+                result.append(current.data)
+                current = current.right
     return result
```
```python
def merge_trees(self, other_node: BinaryTreeNode) -> BinaryTreeNode:
    """
    Merges this tree with another tree given another node
    :param other_node Other Root node, may be None, therefore we return the root node if availables
    :type BinaryTreeNode
    :returns Binary Tree Node
    """
    if not other_node:
        return self.root

    if not self.root:
        return other_node

    self.root.data += other_node.data

    self.root.left = self.merge_trees(other_node.left)
    self.root.right = self.merge_trees(other_node.right)

    return self.root
```
Broken recursion in merge_trees.
The recursive calls self.merge_trees(other_node.left) and self.merge_trees(other_node.right) are incorrect. The method always operates on self.root, so the recursion doesn't traverse self's subtrees—only other_node's. The left/right children of self.root are overwritten rather than merged.
🐛 Proposed fix using a helper function

```diff
 def merge_trees(self, other_node: BinaryTreeNode) -> BinaryTreeNode:
     """
     Merges this tree with another tree given another node
-    :param other_node Other Root node, may be None, therefore we return the root node if availables
-    :type BinaryTreeNode
-    :returns Binary Tree Node
     """
-    if not other_node:
-        return self.root
-
-    if not self.root:
-        return other_node
-
-    self.root.data += other_node.data
-
-    self.root.left = self.merge_trees(other_node.left)
-    self.root.right = self.merge_trees(other_node.right)
-
-    return self.root
+    def merge_helper(node1: Optional[BinaryTreeNode], node2: Optional[BinaryTreeNode]) -> Optional[BinaryTreeNode]:
+        if not node1:
+            return node2
+        if not node2:
+            return node1
+        node1.data += node2.data
+        node1.left = merge_helper(node1.left, node2.left)
+        node1.right = merge_helper(node1.right, node2.right)
+        return node1
+
+    self.root = merge_helper(self.root, other_node)
+    return self.root
```
```python
def dfs(node: Optional[BinaryTreeNode]):
    if not node or node.data < low or node.data > high:
        return 0
    return node.data + dfs(node.left) + dfs(node.right)
```
Incorrect pruning logic for general binary trees.
The current implementation returns 0 and stops traversing children when node.data is outside the range. This is incorrect for a general BinaryTree because children are not ordered—a child node may still have a value within [low, high] even if its parent does not.
This logic would be correct for a BST (as seen in BinarySearchTree.sum_nodes_in_range), but BinaryTree has no ordering guarantees.
🐛 Proposed fix

```diff
 def dfs(node: Optional[BinaryTreeNode]):
-    if not node or node.data < low or node.data > high:
+    if not node:
         return 0
-    return node.data + dfs(node.left) + dfs(node.right)
+    current_sum = node.data if low <= node.data <= high else 0
+    return current_sum + dfs(node.left) + dfs(node.right)
```
Actionable comments posted: 12
🤖 Fix all issues with AI agents
In `@algorithms/backtracking/letter_tile_possibilities/README.md`:
- Around line 3-73: The README currently describes a factorial/count-based
method but the implementation uses set(permutations(...)); either update the
README to describe the implemented approach (explicitly mention using Python's
permutations and de-duplication via set(permutations(...)) and its performance
trade-offs) or change the implementation to the factorial combinatorics
approach; also reword the ambiguous sentence "each sequence must include at most
one tile, tiles[i]" to "each tile may be used at most once per sequence", and
correct the formula to show the product over repeated letter counts as n! / (∏
c_i!) where c_i are counts of each repeated letter.
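The two approaches can be cross-checked against each other. The sketch below mirrors the brute-force `set(permutations(...))` idea and the n! / (∏ c_i!) formula for a single fixed multiset of letters; the function names are illustrative, not the repo's:

```python
from itertools import permutations
from math import factorial

def num_tile_possibilities(tiles: str) -> int:
    # Brute force: de-duplicate every permutation of every non-empty length.
    seen = set()
    for length in range(1, len(tiles) + 1):
        seen.update(permutations(tiles, length))
    return len(seen)

def count_for_multiset(counts) -> int:
    # n! / (prod of c_i!): distinct orderings of one fixed multiset of letters,
    # where counts holds the multiplicity of each distinct letter.
    n = sum(counts)
    total = factorial(n)
    for c in counts:
        total //= factorial(c)
    return total
```

For `"AAB"` the brute force counts 8 sequences, and the full-length arrangements alone are 3!/2! = 3, matching the formula.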
In `@algorithms/dynamic_programming/frog_jump/README.md`:
- Around line 82-84: Update the Space Complexity section to mention both the
mapper and the DP table: state that the mapper/map uses O(n) extra space and
explicitly call out the 2D dp array (named dp) allocated as 1001 x 1001; note
that this fixed-size table can be described as O(1) space if treating the bounds
as constants or O(n²) if you view the table size as dependent on input
magnitude—include both interpretations and clarify which one the README adopts.
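For comparison, the usual hash-map formulation of this problem avoids a fixed-size table entirely: its extra space is proportional to the number of reachable (stone, jump-size) states rather than a preallocated grid. This is a sketch of that alternative, not the repo's implementation:

```python
from typing import List

def can_cross(stones: List[int]) -> bool:
    # jumps[s] = set of jump sizes that can land exactly on stone s
    jumps = {stone: set() for stone in stones}
    jumps[stones[0]].add(0)
    for stone in stones:
        for k in jumps[stone]:
            # From a landing made with jump k, the next jump is k-1, k, or k+1.
            for step in (k - 1, k, k + 1):
                if step > 0 and stone + step in jumps:
                    jumps[stone + step].add(step)
    # Reaching the last stone with any jump size means success.
    return bool(jumps[stones[-1]])
```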
In `@algorithms/dynamic_programming/max_profit_in_job_scheduling/__init__.py`:
- Around line 15-20: Precompute the list of end times once before the loop and
use it in bisect_right to avoid rebuilding [job[1] for job in jobs] on every
iteration: create end_times = [job[1] for job in jobs] before the for i in
range(1, job_len + 1) loop and replace bisect_right([job[1] for job in jobs],
start) with bisect_right(end_times, start); also change the unpacking inside the
loop from "start, end, profit = jobs[i - 1]" to "start, end, p = jobs[i - 1]"
(and use p when updating dp) to avoid shadowing the function parameter named
profit.
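Put together, the refactor reads roughly like this (a sketch with assumed names; jobs are `(start, end, profit)` tuples):

```python
from bisect import bisect_right
from typing import List, Tuple

def max_profit_scheduling(jobs: List[Tuple[int, int, int]]) -> int:
    # Sort by end time, then precompute end times once instead of
    # rebuilding the list on every loop iteration.
    jobs = sorted(jobs, key=lambda job: job[1])
    end_times = [job[1] for job in jobs]
    dp = [0] * (len(jobs) + 1)  # dp[i] = best profit using the first i jobs
    for i in range(1, len(jobs) + 1):
        start, end, p = jobs[i - 1]  # p, not `profit`, avoids shadowing
        # index of the last job that ends no later than this job starts
        k = bisect_right(end_times, start)
        dp[i] = max(dp[i - 1], dp[k] + p)
    return dp[-1]
```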
In `@algorithms/dynamic_programming/max_profit_in_job_scheduling/README.md`:
- Line 1: The top-level markdown heading `#  Maximum Profit in Job Scheduling` has two spaces after the hash, which violates MD019; change it to a single space, i.e. `# Maximum Profit in Job Scheduling` (locate the header text "Maximum Profit in Job Scheduling" in README.md and remove the extra space after the '#' so there is exactly one space).
In `@algorithms/dynamic_programming/maximal_rectangle/README.md`:
- Around line 69-70: The README contains a typo: when matrix[i][j] is '0' the
text says "right[j] is reset to ~" but the implementation uses n (or cols);
update the sentence to state that right[j] is reset to n (or cols) and
current_right is set to j, referencing the variables right[j], current_right,
matrix[i][j], and n/cols so it matches the algorithm implementation.
In `@algorithms/dynamic_programming/word_break/test_word_break.py`:
- Around line 45-47: The test suite contains a duplicate test tuple for the
input "applepenapple" with dictionary ["apple", "pen"] (the repeated entry in
test_word_break.py); remove the redundant tuple from the test cases list so each
scenario appears only once (locate the duplicate tuple in the list of test
inputs and delete the second occurrence).
In `@algorithms/graphs/network_delay_time/__init__.py`:
- Around line 19-31: The early-return check "if k not in graph: return -1" in
network_delay_time is wrong for sink start nodes and the unused vertices set is
unnecessary; remove that early exit, delete the unused "vertices" set population
(or stop populating it), and instead add a single special-case: if n == 1 return
0; let the normal shortest-path/Dijkstra logic operate when k has no outgoing
edges so reachability is determined by the algorithm rather than the incorrect
graph key check.
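A sketch of the intended behavior: plain Dijkstra, letting reachability fall out of the search rather than from a graph-key check. The `n == 1` case also falls out naturally, since the start node always gets distance 0:

```python
import heapq
from typing import Dict, List

def network_delay_time(times: List[List[int]], n: int, k: int) -> int:
    # Adjacency list; a sink node simply has no entry here, which is fine.
    graph: Dict[int, list] = {}
    for u, v, w in times:
        graph.setdefault(u, []).append((v, w))

    dist: Dict[int, int] = {}
    heap = [(0, k)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in dist:
            continue
        dist[node] = d
        for neighbor, weight in graph.get(node, []):
            if neighbor not in dist:
                heapq.heappush(heap, (d + weight, neighbor))

    # All n nodes reached? The answer is the farthest one; otherwise -1.
    return max(dist.values()) if len(dist) == n else -1
```

A start node with no outgoing edges now correctly yields 0 when `n == 1` and -1 otherwise, with no special-case branching on graph membership.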
In `@algorithms/graphs/network_delay_time/test_network_delay_time.py`:
- Around line 39-45: The test case tuple's human-readable description
("times=[[1,2,1],[2,3,2],[3,4,3],[4,1,4]], n=4, k=1") does not match the actual
k argument (2); update the description string to use k=2 so it accurately
reflects the inputs in the tuple (the entry containing [[1, 2, 1], [2, 3, 2],
[3, 4, 3], [4, 1, 4]], 4, 2, 9), or alternatively change the numeric k to 1 if
the intended case was k=1—ensure the text and the value in the test tuple (the
test case tuple in test_network_delay_time.py) are consistent.
In `@algorithms/intervals/data_stream/README.md`:
- Around line 114-115: Update the complexity note header by hyphenating
"Worst-case" (change "Worst case relation to n:" to "Worst-case relation to n:")
in the README line that discusses Add Num(int value) and Get Intervals()
complexity so the grammar is correct; leave the rest of the sentence and
references to Add Num and Get Intervals() unchanged.
In `@datastructures/trees/binary/search_tree/bst_utils.py`:
- Around line 17-30: The current dfs uses float("-inf")/float("inf") as
sentinels which fails for non-numeric T; change dfs to accept Optional[T] bounds
(e.g., dfs(node: Optional[BinaryTreeNode], min_value: Optional[T], max_value:
Optional[T]) -> bool) and replace comparisons with guarded checks (if min_value
is not None and node.data <= min_value: return False; if max_value is not None
and node.data >= max_value: return False), recurse as dfs(node.left, min_value,
node.data) and dfs(node.right, node.data, max_value), and start with dfs(root,
None, None).
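The Optional-bounds rewrite might look like this (`Node` is a minimal stand-in for the repo's `BinaryTreeNode`):

```python
from typing import Optional

class Node:
    """Minimal stand-in for the repo's BinaryTreeNode."""
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right

def is_valid_bst(root: Optional[Node]) -> bool:
    def dfs(node, min_value, max_value):
        if node is None:
            return True
        # None bounds mean "unbounded", so no numeric sentinels are needed
        # and the check works for any comparable type (strings, dates, ...).
        if min_value is not None and node.data <= min_value:
            return False
        if max_value is not None and node.data >= max_value:
            return False
        return dfs(node.left, min_value, node.data) and dfs(node.right, node.data, max_value)

    return dfs(root, None, None)
```

Because the bounds are `Optional[T]` rather than `float` sentinels, a tree of strings validates just as well as a tree of integers.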
In `@datastructures/trees/binary/search_tree/README.md`:
- Around line 453-463: The README text uses min_ and max_ which can be parsed as
emphasis in some Markdown renderers; update the wording to wrap the parameter
names in inline code formatting (e.g., `min_` and `max_`) wherever they
appear—including the phrases "Pass the current node's value as the new max_
value" and "Pass the current node's value as the new min_ value" and any mention
of the helper function—to ensure they render correctly and unambiguously.
In `@datastructures/trees/binary/search_tree/test_binary_search_tree_valid.py`:
- Around line 7-12: IS_VALID_BST_TEST_CASES currently uses level-order-looking
lists but the test harness builds trees by sequentially calling insert_node
(which ignores None), so the constructed trees differ; update the tests in
IS_VALID_BST_TEST_CASES and the test setup in test_binary_search_tree_valid.py
to either (A) treat the lists as insertion sequences by renaming the variable
and adding a brief comment/documentation that these are insert sequences for
insert_node, or (B) implement and call a level-order builder (e.g.,
build_tree_level_order or build_tree_from_level_list) that accepts None
placeholders and constructs the exact tree shape before calling is_valid_bst;
pick one approach and update references to insert_node/is_valid_bst accordingly
so the test input semantics match the construction method.
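Option (B) could be sketched like this; `build_tree_level_order` and `Node` are hypothetical names here, not existing helpers in the repo:

```python
from collections import deque
from typing import List, Optional

class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def build_tree_level_order(values: List[Optional[int]]) -> Optional[Node]:
    # Consume values in level order; None placeholders mean "no child here".
    if not values or values[0] is None:
        return None
    root = Node(values[0])
    queue = deque([root])
    it = iter(values[1:])
    while queue:
        node = queue.popleft()
        for side in ("left", "right"):
            value = next(it, None)
            if value is not None:
                child = Node(value)
                setattr(node, side, child)
                queue.append(child)
    return root
```

With a builder like this, a test list such as `[5, 1, 4, None, None, 3, 6]` constructs exactly the tree shape the level-order notation implies before `is_valid_bst` is called.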
🧹 Nitpick comments (11)
algorithms/graphs/single_cycle_check/__init__.py (2)
14-17: Redundant negative index check in Python.In Python, the modulo operator with a positive divisor always returns a non-negative result (e.g.,
(-5) % 6 = 1). The conditional on line 17 is unnecessary sincenext_indexwill always be>= 0whenn > 0.♻️ Simplified version
def get_next_index(current_idx: int) -> int: jump = array[current_idx] next_index = (jump + current_idx) % n - return next_index if next_index >= 0 else next_index + n + return next_index
4-27: Consider handling empty array edge case.

If `array` is empty (`n = 0`), the function returns `True` (since `current_index == 0`). This may be unexpected behavior depending on requirements. Consider adding an early return or validation.

🛡️ Optional guard for empty input

```diff
 def has_single_cycle(array: List[int]) -> bool:
     """
     Checks if the provided array has a single cycle.
     Args:
         array (List[int]): list of integers.
     Returns:
         bool: True if the given array has a single cycle, False otherwise.
     """
     n = len(array)
+    if n == 0:
+        return False
```

algorithms/graphs/single_cycle_check/test_single_cycle_check.py (2)
6-8: Consider adding more test cases for comprehensive coverage.

Only one test case is provided. Consider adding edge cases such as:

- An array returning `False` (e.g., `[2, 2, 2, 2]` returns to index 0 after touching only indices 0 and 2)
- A single-element array (`[0]` should return `True`, since it stays in place)
- A two-element array forming a cycle (`[1, -1]` should return `True`)

📋 Additional test cases

```diff
 SINGLE_CYCLE_CHECK_TEST_CASES = [
     ([2, 3, 1, -4, -4, 2], True),
+    ([2, 2, 2, 2], False),  # Shorter cycle, doesn't visit all indices
+    ([1, -1], True),  # Two elements forming a cycle
+    ([0], True),  # Single element pointing to itself
 ]
```
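For reference, a minimal standalone sketch of the check these cases exercise - a visited-count walk that succeeds only if the walk returns to index 0 after exactly n jumps (illustrative, not necessarily the repository's exact implementation):

```python
from typing import List


def has_single_cycle(array: List[int]) -> bool:
    """Return True if following the jumps visits every index exactly once
    and ends back at the starting index 0."""
    n = len(array)
    if n == 0:
        return False
    visited = 0
    idx = 0
    while visited < n:
        # Returning to the start before visiting all elements means a shorter cycle
        if visited > 0 and idx == 0:
            return False
        visited += 1
        idx = (idx + array[idx]) % n
    # A single cycle must end where it started
    return idx == 0
```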
1-4: Unused import: `List` from typing.

The `List` type hint is imported but not strictly necessary, since the `array` parameter is already annotated in the function signature being tested.

🧹 Remove unused import

```diff
 import unittest
-from typing import List
 from parameterized import parameterized
 from algorithms.graphs.single_cycle_check import has_single_cycle
```

algorithms/dynamic_programming/frog_jump/__init__.py (1)
10-11: Hardcoded DP dimensions may fail for large inputs.

The DP array is fixed at 2001×2001, which allocates ~16MB regardless of input size. While this matches typical LeetCode constraints (stones.length ≤ 2000), it will:

- Fail with `IndexError` if `n > 2001`
- Waste memory for small inputs

♻️ Consider dynamic sizing or bounds check

Option 1: Add a bounds check:

```diff
 def can_cross(stones: List[int]) -> bool:
     n = len(stones)
+    if n > 2001:
+        raise ValueError("Input exceeds maximum supported size")
```

Option 2: Size dynamically (trades off memory efficiency for flexibility):

```diff
-    dp = [[0] * 2001 for _ in range(2001)]
+    dp = [[0] * (n + 1) for _ in range(n)]
```

algorithms/graphs/network_delay_time/__init__.py (1)
1-4: Unused import: `Set`.

`Set` is imported from `typing`, but the `vertices` set that used it is dead code. After removing the unused `vertices` logic, `Set` can also be removed from the import.

algorithms/dynamic_programming/maximal_rectangle/__init__.py (1)
4-6: Consider adding a docstring for consistency.
`maximal_rectangle_2` and `largest_rectangle_area` have docstrings, but `maximal_rectangle` does not. Adding one would improve consistency and documentation.

📝 Suggested docstring

```diff
 def maximal_rectangle(matrix: List[List[int]]) -> int:
+    """
+    Find the largest rectangle containing only 1s in a binary matrix.
+
+    Uses dynamic programming with height, left, and right boundary arrays
+    to efficiently compute the maximal rectangle area in O(m*n) time.
+
+    Args:
+        matrix: 2D binary matrix with 0s and 1s
+
+    Returns:
+        Area of the largest rectangle containing only 1s
+    """
     if not matrix:
         return 0
```

algorithms/dynamic_programming/maximal_rectangle/test_maximal_rectangle.py (1)
9-76: Consider adding an empty matrix test case.

The implementations handle empty matrices with `if not matrix: return 0`, but there's no test case verifying this behavior. Adding `([], 0)` would ensure this edge case is covered.

📝 Suggested addition

```diff
 MAXIMAL_RECTANGLE_TEST_CASES = [
+    ([], 0),  # empty matrix
     (
         [
             [1, 0, 1, 0, 1],
```

algorithms/backtracking/letter_tile_possibilities/__init__.py (2)
1-56: Reduce factorial memory by counting permutations instead of materializing them.
`set(permutations(current))` builds all permutations in memory for every subset. Since you already sketched the factorial-based approach, consider turning that into real helpers and using it to compute permutation counts directly (and remove the commented-out block).

♻️ Suggested refactor

```diff
-from itertools import permutations
-
 def num_tile_possibilities(tiles: str) -> int:
     # Store unique sequences to avoid duplicates
     unique_letter_sets: Set[str] = set()
@@
     def generate_sequences(current: str, index: int):
@@
         if current not in unique_letter_sets:
             unique_letter_sets.add(current)
-            total_permutations = len(set(permutations(current)))
+            total_permutations = count_permutations(current)
             return total_permutations
         return 0
@@
-    # Optionally, you can write the count_permutations and the factorial method if not using builtin methods
-    # def factorial(n):
-    #
-    #     # Computes the factorial of a given number.
-    #     if n <= 1:
-    #         return 1
-    #
-    #     result = 1
-    #     for num in range(2, n + 1):
-    #         result *= num
-    #     return result
-    #
-    # def count_permutations(sequence):
-    #
-    #     # Calculates the number of unique permutations of a sequence, accounting for duplicate characters.
-    #     permutations = factorial(len(sequence))
-    #
-    #     # Adjust for repeated characters by dividing by factorial of their counts
-    #     for count in Counter(sequence).values():
-    #         permutations //= factorial(count)
-    #
-    #     return permutations
+    def factorial(n: int) -> int:
+        result = 1
+        for num in range(2, n + 1):
+            result *= num
+        return result
+
+    def count_permutations(sequence: str) -> int:
+        counts = {}
+        for ch in sequence:
+            counts[ch] = counts.get(ch, 0) + 1
+        total = factorial(len(sequence))
+        for count in counts.values():
+            total //= factorial(count)
+        return total
```
76-94: Optionally guard against non‑uppercase input to avoid IndexError.
If this function is used outside the problem constraints, a non-`A–Z` character yields a negative/overflow index. Consider a defensive check or a clearer error.

🛡️ Possible defensive check

```diff
 def num_tile_possibilities_with_optimized_recursion(tiles: str) -> int:
     char_count: List[int] = [0] * 26
     for letter in tiles:
+        if letter < "A" or letter > "Z":
+            raise ValueError("tiles must contain only uppercase A-Z letters")
         char_count[ord(letter) - ord("A")] += 1
```

algorithms/backtracking/letter_tile_possibilities/test_num_tile_possibilities.py (1)
9-17: Remove the duplicate test vector to reduce redundant runs.
The dataset includes the same `("ABC", 15)` case twice.

✂️ Suggested change

```diff
 LETTER_TILE_POSSIBILITIES = [
     ("AAB", 8),
     ("ABC", 15),
-    ("ABC", 15),
     ("CDB", 15),
     ("ZZZ", 3),
     ("AAABBC", 188),
     ("V", 1),
 ]
```
You are given a string, tiles, consisting of uppercase English letters. You can arrange the tiles into sequences of any
length (from 1 to the length of tiles), and each sequence must include at most one tile, tiles[i], from tiles.

Your task is to return the number of possible non-empty unique sequences you can make using the letters represented on
tiles[i].

## Constraints:

- 1 ≤ `tiles.length` ≤ 7
- The `tiles` string consists of uppercase English letters.

## Examples

Example 1:

```text
Input: tiles = "AAB"
Output: 8
Explanation: The possible sequences are "A", "B", "AA", "AB", "BA", "AAB", "ABA", "BAA".
```

Example 2:

```text
Input: tiles = "AAABBC"
Output: 188
```

Example 3:

```text
Input: tiles = "V"
Output: 1
```

## Topics

- Hash Table
- String
- Backtracking
- Counting

## Solutions

### Permutations and Combinations

This problem would have been straightforward if the tiles had no duplicates and we only needed to find sequences of the
same length as the given tiles. In that case, we could simply calculate the total unique sequences using n! (where n is
the length of the tiles).



However, in this problem, we need to find all unique sequences of any length—from 1 to the full length of the tiles—while
accounting for duplicate tiles. To achieve this, we take the following approach:

We begin by sorting the letters in tiles so that similar letters are grouped. This allows us to systematically explore
possible letter sets without repeating unnecessary work.

To generate sequences of lengths ranging from 1 to n, we consider each letter one by one. We start with a single
character, determine its possible sequences, then move to two-character combinations, find their possible sequences,
and so on. For each new character, we have two choices: either include the current letter in our selection or skip it
and move to the next one. By exploring both choices, we generate all possible letter sets.

To prevent counting the same sequences multiple times due to duplicate letters, we ensure (in the previous step) that
duplicate letter sets are not counted more than once while generating different letter sets. Then, for each set, we
calculate how many distinct sequences can be formed by applying the updated formula for permutations, which accounts
for repeated letters: `n! / c!`, where c is the count of each repeated letter.

The more repeated letters in a set, the fewer unique ways it can be arranged.

We repeat this process for every possible letter set. Eventually, we sum the count of all these possible sequences to
obtain the total number of unique non-empty sequences. Finally, we subtract one from our final count to exclude the
empty set, as it does not contribute to valid arrangements.
Align the README with the actual implementation and clarify the formula.
The text describes a factorial/count‑based approach, while the current code counts permutations via set(permutations(...)). Either update the doc to mention that approach (and note its trade‑offs) or update code to match the described factorial method. Also, the sentence about tile usage is unclear, and the formula should indicate a product over repeated letter counts.
📝 Suggested wording tweaks
-You are given a string, tiles, consisting of uppercase English letters. You can arrange the tiles into sequences of any
-length (from 1 to the length of tiles), and each sequence must include at most one tile, tiles[i], from tiles.
+You are given a string, tiles, consisting of uppercase English letters. You can arrange the tiles into sequences of any
+length (from 1 to the length of tiles). Each tile may be used at most once in a sequence.
@@
-duplicate letters. To achieve this, we take the following approach:
+duplicate letters. To achieve this, we take the following approach (or alternatively count permutations directly, as in
+the current implementation):
@@
-calculate how many distinct sequences can be formed by applying the updated formula for permutations, which accounts
-for repeated letters: `n! / c!`, where c is the count of each repeated letter.
+calculate how many distinct sequences can be formed by applying the updated formula for permutations, which accounts
+for repeated letters: `n! / (c1! × c2! × …)`, where each ci is the count of a repeated letter.

🤖 Prompt for AI Agents
In `@algorithms/backtracking/letter_tile_possibilities/README.md` around lines 3 -
73, The README currently describes a factorial/count-based method but the
implementation uses set(permutations(...)); either update the README to describe
the implemented approach (explicitly mention using Python's permutations and
de-duplication via set(permutations(...)) and its performance trade-offs) or
change the implementation to the factorial combinatorics approach; also reword
the ambiguous sentence "each sequence must include at most one tile, tiles[i]"
to "each tile may be used at most once per sequence", and correct the formula to
show the product over repeated letter counts as n! / (∏ c_i!) where c_i are
counts of each repeated letter.
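The counting strategy described above can be sketched end to end; the following is an illustrative implementation (function name and structure are assumptions, not necessarily this repository's code), using `math.factorial` and `collections.Counter` for the n!/(∏ci!) formula:

```python
from collections import Counter
from itertools import combinations
from math import factorial


def num_tile_possibilities(tiles: str) -> int:
    total = 0
    seen = set()
    # Enumerate every distinct multiset of tiles, one length at a time
    for length in range(1, len(tiles) + 1):
        for combo in combinations(sorted(tiles), length):
            if combo in seen:
                continue
            seen.add(combo)
            # n! / (c1! * c2! * ...) distinct arrangements of this multiset
            arrangements = factorial(length)
            for count in Counter(combo).values():
                arrangements //= factorial(count)
            total += arrangements
    return total
```

For "AAB" this counts 2 one-letter, 3 two-letter, and 3 three-letter sequences, giving 8, matching Example 1.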
### Space Complexity

The algorithm’s space complexity is `O(n)` occupied by the mapper map.
Space complexity statement may be incomplete.
The space complexity is listed as O(n) for the mapper, but the algorithm also uses a 2D dp array of size 1001 x 1001. If we consider the DP table, the space complexity should mention this fixed-size allocation (which could be considered O(1) constant or O(n²) depending on interpretation).
📝 Suggested clarification
### Space Complexity
-The algorithm's space complexity is `O(n)` occupied by the mapper map.
+The algorithm's space complexity is `O(n)` for the mapper map, plus `O(1)` for the fixed-size 2D DP array (1001 x 1001), assuming the constraints guarantee `n ≤ 1000`.
🤖 Prompt for AI Agents
In `@algorithms/dynamic_programming/frog_jump/README.md` around lines 82 - 84,
Update the Space Complexity section to mention both the mapper and the DP table:
state that the mapper/map uses O(n) extra space and explicitly call out the 2D
dp array (named dp) allocated as 1001 x 1001; note that this fixed-size table
can be described as O(1) space if treating the bounds as constants or O(n²) if
you view the table size as dependent on input magnitude—include both
interpretations and clarify which one the README adopts.
```python
    for i in range(1, job_len + 1):
        start, end, profit = jobs[i - 1]
        # find number of jobs to finish before start of current job
        num_jobs = bisect_right([job[1] for job in jobs], start)

        dp[i] = max(dp[i - 1], dp[num_jobs] + profit)
```
Performance issue: Repeated list creation inside loop.
Line 18 creates a new list [job[1] for job in jobs] on every iteration, resulting in O(n²) time complexity instead of the intended O(n log n).
Extract the end times list once before the loop:
🐛 Proposed fix
jobs = sorted(zip(start_time, end_time, profit), key=lambda x: x[1])
job_len = len(jobs)
dp = [0] * (job_len + 1)
+ end_times = [job[1] for job in jobs]
for i in range(1, job_len + 1):
- start, end, profit = jobs[i - 1]
+ start, _, p = jobs[i - 1]
# find number of jobs to finish before start of current job
- num_jobs = bisect_right([job[1] for job in jobs], start)
+ num_jobs = bisect_right(end_times, start)
- dp[i] = max(dp[i - 1], dp[num_jobs] + profit)
+ dp[i] = max(dp[i - 1], dp[num_jobs] + p)
     return dp[-1]

Note: The fix also renames profit to p in the unpacking to avoid shadowing the function parameter.
🤖 Prompt for AI Agents
In `@algorithms/dynamic_programming/max_profit_in_job_scheduling/__init__.py`
around lines 15 - 20, Precompute the list of end times once before the loop and
use it in bisect_right to avoid rebuilding [job[1] for job in jobs] on every
iteration: create end_times = [job[1] for job in jobs] before the for i in
range(1, job_len + 1) loop and replace bisect_right([job[1] for job in jobs],
start) with bisect_right(end_times, start); also change the unpacking inside the
loop from "start, end, profit = jobs[i - 1]" to "start, end, p = jobs[i - 1]"
(and use p when updating dp) to avoid shadowing the function parameter named
profit.
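Putting the fix together, the whole function might look like this (a sketch following the LeetCode signature; variable names beyond the diff are assumptions):

```python
from bisect import bisect_right
from typing import List


def job_scheduling(start_time: List[int], end_time: List[int], profit: List[int]) -> int:
    # Sort jobs by end time so dp[i] covers the first i jobs in end order
    jobs = sorted(zip(start_time, end_time, profit), key=lambda x: x[1])
    end_times = [job[1] for job in jobs]  # built once, O(n)
    dp = [0] * (len(jobs) + 1)
    for i in range(1, len(jobs) + 1):
        start, _, p = jobs[i - 1]
        # Number of jobs that finish no later than this job starts
        num_jobs = bisect_right(end_times, start)
        dp[i] = max(dp[i - 1], dp[num_jobs] + p)
    return dp[-1]
```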
```diff
@@ -0,0 +1,116 @@
+#  Maximum Profit in Job Scheduling
```
Fix heading spacing to satisfy markdownlint (MD019).
Use a single space after #.
📝 Suggested edit
```diff
-#  Maximum Profit in Job Scheduling
+# Maximum Profit in Job Scheduling
```
🧰 Tools
🪛 markdownlint-cli2 (0.20.0)
[warning] 1-1: Multiple spaces after hash on atx style heading
(MD019, no-multiple-space-atx)
🤖 Prompt for AI Agents
In `@algorithms/dynamic_programming/max_profit_in_job_scheduling/README.md` at
line 1, The top-level markdown heading "# Maximum Profit in Job Scheduling" has
two spaces after the hash which violates MD019; change it to a single space by
updating the heading to "# Maximum Profit in Job Scheduling" (locate the header
text "Maximum Profit in Job Scheduling" in README.md and remove the extra space
after the '#' so there is exactly one space).
- Otherwise, the `matrix[i][j]` is ‘0’:
  - `right[j]` is reset to ~, and `current_right` is updated to `j`.
Typo: ~ should be n.
The text says "right[j] is reset to ~" but based on the implementation, it should be reset to n (or cols).
📝 Proposed fix
- Otherwise, the `matrix[i][j]` is '0':
- - `right[j]` is reset to ~, and `current_right` is updated to `j`.
+  - `right[j]` is reset to `n`, and `current_right` is updated to `j`.
🤖 Prompt for AI Agents
In `@algorithms/dynamic_programming/maximal_rectangle/README.md` around lines 69 -
70, The README contains a typo: when matrix[i][j] is '0' the text says "right[j]
is reset to ~" but the implementation uses n (or cols); update the sentence to
state that right[j] is reset to n (or cols) and current_right is set to j,
referencing the variables right[j], current_right, matrix[i][j], and n/cols so
it matches the algorithm implementation.
```python
    (
        "times=[[1,2,1],[2,3,2],[3,4,3],[4,1,4]], n=4, k=1",
        [[1, 2, 1], [2, 3, 2], [3, 4, 3], [4, 1, 4]],
        4,
        2,
        9,
    ),
```
Test case description mismatch.
The description string says k=1 but the actual k parameter passed is 2. This could cause confusion when debugging test failures.
📝 Proposed fix
(
- "times=[[1,2,1],[2,3,2],[3,4,3],[4,1,4]], n=4, k=1",
+ "times=[[1,2,1],[2,3,2],[3,4,3],[4,1,4]], n=4, k=2",
[[1, 2, 1], [2, 3, 2], [3, 4, 3], [4, 1, 4]],
4,
2,
9,
    ),
🤖 Prompt for AI Agents
In `@algorithms/graphs/network_delay_time/test_network_delay_time.py` around lines
39 - 45, The test case tuple's human-readable description
("times=[[1,2,1],[2,3,2],[3,4,3],[4,1,4]], n=4, k=1") does not match the actual
k argument (2); update the description string to use k=2 so it accurately
reflects the inputs in the tuple (the entry containing [[1, 2, 1], [2, 3, 2],
[3, 4, 3], [4, 1, 4]], 4, 2, 9), or alternatively change the numeric k to 1 if
the intended case was k=1—ensure the text and the value in the test tuple (the
test case tuple in test_network_delay_time.py) are consistent.
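For context, a standard Dijkstra solution reproduces the expected value of 9 for this input (an illustrative sketch, not necessarily this repository's implementation):

```python
import heapq
from collections import defaultdict
from typing import List


def network_delay_time(times: List[List[int]], n: int, k: int) -> int:
    graph = defaultdict(list)
    for u, v, w in times:
        graph[u].append((v, w))
    dist = {}
    heap = [(0, k)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in dist:  # already settled with a shorter distance
            continue
        dist[node] = d
        for nxt, w in graph[node]:
            if nxt not in dist:
                heapq.heappush(heap, (d + w, nxt))
    # All n nodes must be reachable, otherwise the signal never arrives
    return max(dist.values()) if len(dist) == n else -1
```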
- Worst case relation to n: If there are n Add Num(int value) calls and nothing ever merges, then k=`O(n)`, giving
  Add Num(int value) `O(log(n))` and Get Intervals() `O(n)`.
Hyphenate “Worst-case” in the complexity note.
Minor grammar fix for clarity.
🔧 Suggested edit
-- Worst case relation to n: If there are n Add Num(int value) calls and nothing ever merges, then k=`O(n)`, giving
+- Worst-case relation to n: If there are n Add Num(int value) calls and nothing ever merges, then k=`O(n)`, giving
🧰 Tools
🪛 LanguageTool
[grammar] ~114-~114: Use a hyphen to join words.
Context: ...erval once to build the output. - Worst case relation to n: If there are n Add N...
(QB_NEW_EN_HYPHEN)
🤖 Prompt for AI Agents
In `@algorithms/intervals/data_stream/README.md` around lines 114 - 115, Update
the complexity note header by hyphenating "Worst-case" (change "Worst case
relation to n:" to "Worst-case relation to n:") in the README line that
discusses Add Num(int value) and Get Intervals() complexity so the grammar is
correct; leave the rest of the sentence and references to Add Num and Get
Intervals() unchanged.
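To make the Add Num / Get Intervals trade-off concrete, here is a deliberately simplified sketch that keeps a sorted list of values and rebuilds the intervals on read. Note this makes Add Num O(n) per call rather than the O(log n) the README describes, so it illustrates the interface, not the complexity bound; all names are assumptions:

```python
from bisect import insort


class SummaryRanges:
    def __init__(self) -> None:
        self.nums = []  # sorted list of distinct values seen so far

    def add_num(self, value: int) -> None:
        if value not in self.nums:  # linear membership check keeps the sketch simple
            insort(self.nums, value)

    def get_intervals(self):
        intervals = []
        for n in self.nums:
            # Extend the last interval when the value is adjacent, else start a new one
            if intervals and n == intervals[-1][1] + 1:
                intervals[-1][1] = n
            else:
                intervals.append([n, n])
        return intervals
```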
```python
    def dfs(node: Optional[BinaryTreeNode], min_value: T, max_value: T) -> bool:
        if not node:
            return True

        # Is the current node's value within the given range?
        if node.data <= min_value or node.data >= max_value:
            # If not, return False immediately
            return False

        return dfs(node.left, min_value, node.data) and dfs(
            node.right, node.data, max_value
        )

    return dfs(root, float("-inf"), float("inf"))
```
Avoid float sentinels for generic node values.
Using float("-inf")/float("inf") will crash for non-numeric T (e.g., strings). Prefer optional bounds and guard comparisons.
✅ Suggested fix
- def dfs(node: Optional[BinaryTreeNode], min_value: T, max_value: T) -> bool:
+ def dfs(
+ node: Optional[BinaryTreeNode],
+ min_value: Optional[T],
+ max_value: Optional[T],
+ ) -> bool:
if not node:
return True
# Is the current node's value within the given range?
- if node.data <= min_value or node.data >= max_value:
+ if (min_value is not None and node.data <= min_value) or (
+ max_value is not None and node.data >= max_value
+ ):
# If not, return False immediately
return False
return dfs(node.left, min_value, node.data) and dfs(
node.right, node.data, max_value
)
- return dfs(root, float("-inf"), float("inf"))
+ return dfs(root, None, None)

🤖 Prompt for AI Agents
In `@datastructures/trees/binary/search_tree/bst_utils.py` around lines 17 - 30,
The current dfs uses float("-inf")/float("inf") as sentinels which fails for
non-numeric T; change dfs to accept Optional[T] bounds (e.g., dfs(node:
Optional[BinaryTreeNode], min_value: Optional[T], max_value: Optional[T]) ->
bool) and replace comparisons with guarded checks (if min_value is not None and
node.data <= min_value: return False; if max_value is not None and node.data >=
max_value: return False), recurse as dfs(node.left, min_value, node.data) and
dfs(node.right, node.data, max_value), and start with dfs(root, None, None).
This helper function will introduce two parameters, min_ and max_, which represent the range of values that the current
subtree's nodes can take on. The helper function will return a boolean value indicating whether the current subtree is
a valid binary search tree.

When we recurse to our left child, we:

- Pass the current node's value as the new max_ value, since the left child's value must be less than the current node's
  value. min_ remains the same.

When we recurse to our right child, we:

- Pass the current node's value as the new min_ value, since the right child's value must be greater than the current
  node's value. max_ remains the same.
Markdown emphasis markers may cause rendering issues.
The use of min_ and max_ with underscores could be interpreted as emphasis markers by some Markdown renderers when followed/preceded by spaces. Consider using backticks for inline code.
📝 Suggested fix
-This helper function will introduce two parameters, min_ and max_, which represent the range of values that the current
+This helper function will introduce two parameters, `min_` and `max_`, which represent the range of values that the current
subtree's nodes can take on. The helper function will return a boolean value indicating whether the current subtree is
a valid binary search tree.
When we recurse to our left child, we:
-- Pass the current node's value as the new max_ value, since the left child's value must be less than the current node's
- value. min_ remains the same.
+- Pass the current node's value as the new `max_` value, since the left child's value must be less than the current node's
+ value. `min_` remains the same.
When we recurse to our right child, we:
-- Pass the current node's value as the new min_ value, since the right child's value must be greater than the current
- node's value. max_ remains the same.
+- Pass the current node's value as the new `min_` value, since the right child's value must be greater than the current
+  node's value. `max_` remains the same.

🧰 Tools
🪛 markdownlint-cli2 (0.20.0)
[warning] 453-453: Spaces inside emphasis markers
(MD037, no-space-in-emphasis)
[warning] 458-458: Spaces inside emphasis markers
(MD037, no-space-in-emphasis)
[warning] 462-462: Spaces inside emphasis markers
(MD037, no-space-in-emphasis)
🤖 Prompt for AI Agents
In `@datastructures/trees/binary/search_tree/README.md` around lines 453 - 463,
The README text uses min_ and max_ which can be parsed as emphasis in some
Markdown renderers; update the wording to wrap the parameter names in inline
code formatting (e.g., `min_` and `max_`) wherever they appear—including the
phrases "Pass the current node's value as the new max_ value" and "Pass the
current node's value as the new min_ value" and any mention of the helper
function—to ensure they render correctly and unambiguously.
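A self-contained sketch of this min_/max_ recursion, using None for the open bounds so it also works for non-numeric values (the Node class here is illustrative, not the repository's BinaryTreeNode):

```python
from typing import Optional


class Node:
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right


def is_valid_bst(root: Optional[Node]) -> bool:
    def dfs(node: Optional[Node], min_, max_) -> bool:
        if node is None:
            return True
        # The node's value must lie strictly inside the (min_, max_) range
        if (min_ is not None and node.data <= min_) or (
            max_ is not None and node.data >= max_
        ):
            return False
        return dfs(node.left, min_, node.data) and dfs(node.right, node.data, max_)

    return dfs(root, None, None)
```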
```python
IS_VALID_BST_TEST_CASES = [
    ([8, 3, 10, 1, 6], True),
    ([2, 1, 4], True),
    ([1, None, 1], False),
    ([6, 2, 8, None, None, 7, 11], True),
]
```
Test case format may be misleading.
The test cases appear to use level-order tree notation (e.g., [1, None, 1]), but the test inserts values sequentially using insert_node, which skips None values. The actual trees built may differ from what the notation suggests.
For example, [6, 2, 8, None, None, 7, 11] would insert [6, 2, 8, 7, 11] in BST order, not construct the level-order tree.
Consider either:
- Documenting that these are insertion sequences, not level-order representations
- Using a level-order tree construction method instead
🤖 Prompt for AI Agents
In `@datastructures/trees/binary/search_tree/test_binary_search_tree_valid.py`
around lines 7 - 12, IS_VALID_BST_TEST_CASES currently uses level-order-looking
lists but the test harness builds trees by sequentially calling insert_node
(which ignores None), so the constructed trees differ; update the tests in
IS_VALID_BST_TEST_CASES and the test setup in test_binary_search_tree_valid.py
to either (A) treat the lists as insertion sequences by renaming the variable
and adding a brief comment/documentation that these are insert sequences for
insert_node, or (B) implement and call a level-order builder (e.g.,
build_tree_level_order or build_tree_from_level_list) that accepts None
placeholders and constructs the exact tree shape before calling is_valid_bst;
pick one approach and update references to insert_node/is_valid_bst accordingly
so the test input semantics match the construction method.
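Option (B) above - a level-order builder that honors None placeholders - might be sketched as follows (the Node class and function name are illustrative assumptions):

```python
from collections import deque
from typing import List, Optional


class Node:
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right


def build_tree_level_order(values: List[Optional[int]]) -> Optional[Node]:
    """Build a binary tree from a level-order list where None marks a missing child."""
    if not values or values[0] is None:
        return None
    root = Node(values[0])
    queue = deque([root])
    i = 1
    while queue and i < len(values):
        node = queue.popleft()
        if i < len(values) and values[i] is not None:
            node.left = Node(values[i])
            queue.append(node.left)
        i += 1
        if i < len(values) and values[i] is not None:
            node.right = Node(values[i])
            queue.append(node.right)
        i += 1
    return root
```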
Describe your change:
Tree traversal algorithms
Checklist:
Fixes: #{$ISSUE_NO}.

Summary by CodeRabbit
New Features
Bug Fixes / Cleanup
Refactor