MLP Blog

A blog for MLP. MLP stands for Model, Language, and Philosophy, NOT Multi-Layer Perceptron 😛

My name is Yisong. I am a junior Ph.D. student studying Natural Language Processing.


Most of my current work studies limited datasets (annotated by a limited pool of humans) in a limited form (mostly tensors) with limited models (mostly invented in the last five years).

How does my current work relate to classic math / ML / AI models (perhaps invented 20 years ago)?

How does my current work relate to classic linguistic theory?

What philosophical views does my current work bear on?


I expect an update frequency around once a month.

I have come up with titles for the first few blogs:

Blog 0.1: On the MaxProb method in NLP.

One-liner summary: Prof Min asked me to present this in a future group meeting, so I decided to write up a blog/PDF for clarity. MaxProb describes a model's confidence in its prediction. Tutorial statement: MaxProb is good enough, but it can be better calibrated and adapted.

Status: Done on Nov. 19th 2021 🎉.
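For readers unfamiliar with the term: MaxProb simply takes the highest softmax probability as the model's confidence. A minimal sketch (the function names and example logits are illustrative, not from any particular paper):

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def maxprob_confidence(logits):
    # MaxProb: the predicted class is the argmax, and the highest
    # softmax probability serves as the confidence score.
    probs = softmax(np.asarray(logits, dtype=float))
    return int(probs.argmax()), float(probs.max())

pred, conf = maxprob_confidence([2.0, 0.5, -1.0])
# pred is class 0; conf is the top softmax probability (here about 0.79).
```

Calibration methods (e.g., temperature scaling) then adjust these probabilities so that confidence better matches empirical accuracy.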


Blog 0.2: What do you mean by understand?

One-liner summary: This is the million-dollar question my advisor asked me during a research meeting. He compared a neural network to a human with autism. He advised me to find a smart proxy for measuring understanding.

Status: In Progress.

Update (Jan. 16th): I will write around the keyword “decompose”.


Blog 0.3: When are modular neural networks better than unstructured neural networks?

One liner summary: I am curious!

Status: Not started.

Blog 0.4: Entity and relation embeddings.

One-liner summary: Knowledge bases are not my research focus, but the math behind these embedding methods is fascinating, so I decided to write up a technical note for clarity.

Status: Not started.
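To preview the flavor of that math: one well-known family of methods (TransE-style translation embeddings) scores a fact (head, relation, tail) by how close head + relation lands to tail in vector space. A toy sketch with hand-picked vectors, not trained embeddings:

```python
import numpy as np

def transe_score(head, rel, tail):
    # TransE idea: a triple (h, r, t) is plausible when h + r ≈ t,
    # so the score is the negative Euclidean distance ||h + r - t||.
    # Higher (less negative) score means a more plausible triple.
    return -np.linalg.norm(head + rel - tail)

h = np.array([1.0, 0.0])        # toy entity embedding
r = np.array([0.0, 1.0])        # toy relation embedding
t_good = np.array([1.0, 1.0])   # equals h + r exactly
t_bad = np.array([-1.0, 0.0])   # far from h + r

# transe_score(h, r, t_good) is 0.0 (a perfect match),
# and it exceeds transe_score(h, r, t_bad).
```

Other methods in this family change the scoring function (bilinear forms, complex-valued embeddings, etc.) while keeping the same triple-scoring setup.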

Blog 1: When a junior NLPer reads Ludwig Wittgenstein.

Blog 2: On information bottleneck with my tasks.

Related reading: Concept Bottleneck Models (ICML ’20).

Blog 3: On systematic generalization with my tasks (starting from F&P 1988).

Others’ Blogs

While I am busy writing (read: procrastinating on) my future articles, others’ blogs might fill the void: