# Variational inference for Bayesian networks

In Bayesian neural networks, we want the posterior over the weights, which Bayes' rule gives as:

$$p(w \mid D) = \frac{p(D \mid w)\,p(w)}{\int p(D \mid w)\,p(w)\,dw}$$

However, the integral in the denominator is intractable: in practice it has no analytic solution. Therefore, we introduce a simpler distribution $q(w \mid \theta)$ and fit it to approximate the true posterior. The KL divergence $\mathrm{KL}(q(w \mid \theta)\,\|\,p(w \mid D))$ can be used to indicate the difference between these two distributions, and variational inference minimizes it over $\theta$.
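As a concrete illustration that KL divergence quantifies the mismatch between two distributions, here is the closed-form KL between two one-dimensional Gaussians. This is a sketch for illustration only; the `kl_gaussians` helper is my own name, not part of any library.

```python
import math

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) for two 1-D Gaussians q and p.

    KL = ln(sigma_p / sigma_q)
         + (sigma_q^2 + (mu_q - mu_p)^2) / (2 * sigma_p^2) - 1/2
    """
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)

# KL is zero when q and p coincide, and grows as q drifts away from p.
print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_gaussians(0.0, 1.0, 1.0, 1.0))  # positive: the means differ
```

Note that KL is not symmetric: `kl_gaussians(0, 1, 1, 2)` and `kl_gaussians(1, 2, 0, 1)` give different values, which is why the direction of the divergence matters in variational inference.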

We fall,
We break,
We fail,

But then,

We rise,
We heal,
We overcome.

# Printing a pyramid matrix

How to print a pyramid matrix like that:

```
n = 2
[1, 1, 1]
[1, 2, 1]
[1, 1, 1]

n = 3
[1, 1, 1, 1]
[1, 2, 2, 1]
[1, 2, 2, 1]
[1, 1, 1, 1]

n = 4
[1, 1, 1, 1, 1]
[1, 2, 2, 2, 1]
[1, 2, 3, 2, 1]
[1, 2, 2, 2, 1]
[1, 1, 1, 1, 1]
```
```
def func(N):
    # The matrix is (N + 1) x (N + 1); each ring closer to the
    # center holds a value one larger than the ring outside it.
    N += 1
    matrix = [[1 for _ in range(N)] for _ in range(N)]
    cnt = 0

    # Walk the rings from the outside in; (N + 1) // 2 rings cover everything,
    # including the single center cell when N is odd.
    while cnt < (N + 1) // 2:
        # UP: top edge, left to right
        for i in range(cnt, N - cnt):
            matrix[cnt][i] = cnt + 1

        # RIGHT: right edge, top to bottom
        for i in range(cnt, N - cnt):
            matrix[i][N - cnt - 1] = cnt + 1

        # DOWN: bottom edge, right to left
        for i in range(N - cnt - 1, cnt - 1, -1):
            matrix[N - cnt - 1][i] = cnt + 1

        # LEFT: left edge, bottom to top
        for i in range(N - cnt - 1, cnt - 1, -1):
            matrix[i][cnt] = cnt + 1

        cnt += 1

    return matrix


if __name__ == "__main__":
    matrix = func(N=4)

    for line in matrix:
        print(line)
```
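The same matrix can also be built in closed form: each cell holds one plus its minimum distance to any edge, so no ring-walking is needed. A short sketch for cross-checking (the `pyramid` name is my own, not from the original):

```python
def pyramid(n):
    # Each cell's value is min distance to the nearest edge, plus 1.
    size = n + 1
    return [
        [min(i, j, size - 1 - i, size - 1 - j) + 1 for j in range(size)]
        for i in range(size)
    ]

for row in pyramid(4):
    print(row)
```

This produces the same output as the ring-based version and makes the invariant explicit: the value at `(i, j)` depends only on how deep that cell sits inside the matrix.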

# Reversing a linked list

Example:

```
Input: 1->2->3->4->5->NULL
Output: 5->4->3->2->1->NULL
```

A linked list can be reversed either iteratively or recursively. Could you implement both?

As you can see, the recursive implementation is fairly easy to write, while the iterative one takes a bit more care. Below are both implementations.

```
class Solution(object):
    # iteratively
    def reverseList(self, head):
        """
        :type head: ListNode
        :rtype: ListNode
        """
        prev = None
        while head:
            # Re-point the current node backwards, then step forward.
            head.next, prev, head = prev, head, head.next
        return prev

    # recursively
    def reverseListRecursive(self, head, prev=None):
        if not head:
            return prev
        head.next, nxt = prev, head.next
        return self.reverseListRecursive(nxt, prev=head)
```
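To sanity-check a reversal, here is a small self-contained harness. The `ListNode` definition and the `build`/`to_list` helpers are assumptions for illustration, since the original snippet omits them:

```python
class ListNode(object):
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def build(values):
    # Build a singly linked list from a Python list.
    head = None
    for v in reversed(values):
        head = ListNode(v, head)
    return head

def to_list(node):
    # Collect node values back into a Python list.
    out = []
    while node:
        out.append(node.val)
        node = node.next
    return out

def reverse(head):
    # Iterative reversal: walk the list once, flipping each pointer.
    prev = None
    while head:
        head.next, prev, head = prev, head, head.next
    return prev

print(to_list(reverse(build([1, 2, 3, 4, 5]))))  # [5, 4, 3, 2, 1]
```

The tuple assignment `head.next, prev, head = prev, head, head.next` works because Python evaluates the entire right-hand side before assigning, so the old `head.next` is captured before the pointer is flipped.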

# Keep going

Listen, smile, agree, and then do whatever the fuck you were gonna do anyway.

Cheers