I am trying to find the minimum depth of a binary tree, but example 5 in my test cases fails. I cannot see the flaw in my logic that keeps it from working for all test cases. An example of what I am doing is as follows:

```
Example:
Given binary tree [3,9,20,null,null,15,7],

    3
   / \
  9  20
    /  \
   15   7

return its minimum depth = 2
```
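In case the bracket notation is unfamiliar: it is a level-order listing of the tree with `null` marking a missing child. This is a small helper I use to build trees from that form (my own convenience sketch, not part of the problem itself):

```javascript
// Build a tree from a level-order array such as [3, 9, 20, null, null, 15, 7].
// null entries mean the corresponding child does not exist.
const buildTree = values => {
  const node = val => ({ val, left: null, right: null })
  if (!values.length || values[0] == null) return null
  const root = node(values[0])
  const queue = [root]
  let i = 1
  while (i < values.length) {
    const parent = queue.shift()
    if (i < values.length && values[i] != null) {
      parent.left = node(values[i])
      queue.push(parent.left)
    }
    i++
    if (i < values.length && values[i] != null) {
      parent.right = node(values[i])
      queue.push(parent.right)
    }
    i++
  }
  return root
}

const example = buildTree([3, 9, 20, null, null, 15, 7])
console.log(example.right.left.val) // 15, the left child of 20
```

That is only for convenience; the examples below construct the trees by hand.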

I have the following code to accomplish this:

```javascript
class TreeNode {
  constructor(val) {
    this.val = val
    this.left = this.right = null
  }
}

const minDepth = root => {
  if (!root) return 0

  // BFS: the string 's' is a sentinel marking the end of the current level
  const traverse = root => {
    let counter = 1
    if (!root) return counter
    let current
    let queue = [root, 's']

    while (queue.length > 1) {
      current = queue.shift()
      if (current === 's') counter++, queue.push('s')
      if (!current.left && !current.right) return counter
      else {
        if (current.left) queue.push(current.left)
        if (current.right) queue.push(current.right)
      }
    }
    return counter
  }

  return root.left && root.right
    ? Math.min(traverse(root.left), traverse(root.right)) + 1
    : traverse(root)
}

// example 1
const tree1 = new TreeNode(3)
tree1.left = new TreeNode(9)
tree1.right = new TreeNode(20)
tree1.right.left = new TreeNode(15)
tree1.right.right = new TreeNode(7)

// example 2
const tree2 = new TreeNode(1)
tree2.left = new TreeNode(2)
tree2.right = new TreeNode(3)
tree2.left.left = new TreeNode(4)
tree2.right.right = new TreeNode(5)

// example 3
const tree3 = new TreeNode(0)

// example 4
const tree4 = new TreeNode(1)
tree4.left = new TreeNode(2)

// example 5 (not working)
const tree5 = new TreeNode(1)
tree5.left = new TreeNode(2)
tree5.left.right = new TreeNode(3)
tree5.left.right.right = new TreeNode(4)
tree5.left.right.right.right = new TreeNode(5)

console.log(minDepth(tree1)) // 2
console.log(minDepth(tree2)) // 3
console.log(minDepth(tree3)) // 1
console.log(minDepth(tree4)) // 2
console.log(minDepth(tree5)) // prints 2, but I expect 5
```
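For reference, the outputs I expect for the five examples are 2, 3, 1, 2 and 5. This is a plain recursive sanity check I wrote to compare against (my own sketch, separate from the BFS above); the key rule it encodes is that a node with only one child has to recurse into the child it does have, since the missing side is not a leaf at depth 0. It returns 5 for the tree in example 5, while my BFS version prints 2:

```javascript
// Recursive sanity check: minimum depth is the number of nodes along the
// shortest root-to-leaf path.
class Node {
  constructor(val) {
    this.val = val
    this.left = this.right = null
  }
}

const minDepthRec = node => {
  if (!node) return 0
  if (!node.left) return minDepthRec(node.right) + 1  // only the right subtree can reach a leaf
  if (!node.right) return minDepthRec(node.left) + 1  // only the left subtree can reach a leaf
  return Math.min(minDepthRec(node.left), minDepthRec(node.right)) + 1
}

// Example 5 rebuilt: the chain 1 -> 2 -> 3 -> 4 -> 5
const chain = new Node(1)
chain.left = new Node(2)
chain.left.right = new Node(3)
chain.left.right.right = new Node(4)
chain.left.right.right.right = new Node(5)

console.log(minDepthRec(chain)) // 5
```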

Any thoughts as to what I am missing?