NLP Road Map


  • Since the keywords are laid out as a semantic mind-map, the relationships among them can be interpreted in ambiguous ways. Please focus on the KEYWORDS in the square boxes and treat them as the essential parts to learn.
  • Packing such a plethora of keywords and knowledge into a single image was challenging, so please note that this roadmap is just one suggestion among many.
  • You are free to use this material as you wish, including for commercial purposes, but you are kindly asked to leave a reference (
Probability and Statistics

Something interesting

No matter what field you are in, you can do research while it is growing, work in industry while it is stable, and move into education once it is saturated. Andrew Ng is clearly a sensible person.





To be honest, having a decent job, a house, and a car at the age of 25, I should not complain. However, jobs, houses, and cars are all ordinary commodities; others may have ten or even a hundred times more of them than I do. But the ten years of youth come only once.

So please remember this when you are 26 and deciding whether you want to be a 30-year-old PhD.


import numpy as np
import networkx as nx
from functools import reduce
import matplotlib.pyplot as plt

# Adjacency matrix of a small directed graph: entry (i, j) = 1
# means there is an edge from node i to node j.
connect_graph = np.array([[0, 1, 0, 0, 0],
                          [0, 0, 0, 1, 0],
                          [0, 0, 0, 1, 0],
                          [0, 0, 0, 0, 1],
                          [0, 0, 1, 0, 0]])

# Boolean semiring: logical OR plays the role of addition
# and logical AND the role of multiplication.
def ring_add(a, b):
    return a or b

def ring_multi(a, b):
    return a and b

def dot_product(connect_graph, i, j):
    # Semiring inner product of row i and column j.
    # (The graph is passed in explicitly; the original version read a
    # global, which broke next_generation for any other matrix.)
    row = connect_graph[i]
    column = connect_graph[:, j]
    return reduce(ring_add, [ring_multi(a, b) for a, b in zip(row, column)])

def next_generation(connect_graph):
    # Boolean matrix product of the graph with itself:
    # entry (i, j) becomes 1 iff j is reachable from i in exactly two steps.
    candidate_number = connect_graph.shape[0]

    new_connect_graph = np.zeros((candidate_number, candidate_number))

    for i in range(candidate_number):
        for j in range(candidate_number):
            new_connect_graph[i][j] = dot_product(connect_graph, i, j)
    return new_connect_graph

new_connect_graph = next_generation(connect_graph)

def draw_graph(connect_graph):
    # Render the adjacency matrix as a directed graph.
    G = nx.DiGraph()
    candidate_number = connect_graph.shape[0]
    for i in range(candidate_number):
        for j in range(candidate_number):
            if connect_graph[i][j]:
                G.add_edge(i, j)

    nx.draw(G, with_labels=True)
    plt.show()
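As a quick sanity check (a self-contained sketch; the matrix below mirrors the one defined above), the boolean product of the adjacency matrix with itself lists exactly the two-step paths in the graph:

```python
import numpy as np

# Same 5-node adjacency matrix as above.
A = np.array([[0, 1, 0, 0, 0],
              [0, 0, 0, 1, 0],
              [0, 0, 0, 1, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 1, 0, 0]])

# Boolean matrix product: A2[i, j] == 1 iff there is a
# path of exactly two edges from i to j.
A2 = (A @ A > 0).astype(int)

# Edges in A: 0->1, 1->3, 2->3, 3->4, 4->2.
# Two-step paths: 0->1->3, 1->3->4, 2->3->4, 3->4->2, 4->2->3.
print(A2)
```

Raising the matrix to higher powers (over the same boolean semiring) gives reachability in three steps, four steps, and so on.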


Variational inference for Bayes Network

In general, such networks have a loss of roughly the following form:

However, the integral in the denominator is intractable: an analytic solution cannot be found in practice. Therefore, we construct a distribution that approximates the original distribution, and the KL divergence can be used to measure the difference between the two.
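A minimal sketch of the standard setup, assuming the usual notation with observed data x and latent variables z (which the loss above presumably involves):

```latex
% Posterior via Bayes' rule; the evidence in the denominator
% requires an integral over all latent configurations:
p(z \mid x) = \frac{p(x \mid z)\, p(z)}{\int p(x \mid z)\, p(z)\, dz}

% Variational inference replaces the intractable posterior with an
% approximating distribution q(z), chosen to minimize
\mathrm{KL}\big(q(z) \,\|\, p(z \mid x)\big)
  = \mathbb{E}_{q}\!\left[\log \frac{q(z)}{p(z \mid x)}\right],

% which is equivalent to maximizing the evidence lower bound (ELBO):
\log p(x) \;\ge\; \mathbb{E}_{q}\left[\log p(x, z)\right]
  - \mathbb{E}_{q}\left[\log q(z)\right].
```

Minimizing the KL divergence and maximizing the ELBO differ only by the constant log p(x), which is why the intractable evidence never needs to be computed.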