Category: AI
-
DL implementation training – Jan 12, 2024
After a week of absence, I am finally back here writing. I was in contact with someone helpful and important who is already doing my dream research and is willing to guide me through my system 2 journey. And I was planning my next chapter of study, intended to last 3 months (or…
-
What does Bengio have to say about consciousness? – Jan 3-5, 2024
I have studied the science of consciousness, particularly GWT, and its influence on constructing system 2 AI. The remaining perspective I haven’t examined is the AI scientists’ own ideas on consciousness. I asked the question: if system 2 AI is successful, could it be used to simulate consciousness? I have already got a preliminary…
-
Gap in System 2 AI formulation, Dec 27-29, 2023
Bengio’s system 2 AI and GFlowNet took inspiration from Bernard Baars’ Global Workspace Theory. Mila researchers take the “limited capacity” element from GWT and posit that, due to the biological bottleneck, high-level thoughts are necessarily constructed in a sequential manner from a small number of discrete concepts. However, such a position increasingly looks…
-
Consciousness, System 2 and AI – Christmas holiday 2023
This post discusses phenomena in a general manner, and terms like “subconscious” and “cognitive biases” are used in the way accepted in popular literature. I was imagining my conversation with Justin, my skip manager, about what I am studying. I would say to him: “next time you make a judgement or come up with an…
-
GFlowNet Study – Dec 19, 2023
What is energy-based modeling? My answer after brief reading: the fundamental idea of EBM is to interpret a system in terms of energy. The energy function assigns a scalar value to each state of a system. The lower the energy, the more stable, desirable or likely the state. In optimization, the energy function is similar to…
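To make the scalar-energy idea concrete, here is a minimal sketch (my own illustration, not from the post) of the Boltzmann distribution p(x) ∝ exp(−E(x)), under which lower-energy states are more likely:

```python
import numpy as np

def boltzmann(energies: np.ndarray) -> np.ndarray:
    """Convert scalar energies to probabilities: lower energy -> higher probability."""
    logits = -energies
    logits -= logits.max()      # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

energies = np.array([0.0, 1.0, 3.0])   # hypothetical energies of three states
probs = boltzmann(energies)
# The lowest-energy state receives the largest probability mass.
```

This is only the normalized-distribution view; in real EBMs the normalizing constant over all states is usually intractable, which is part of why training them is hard.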
-
GFlowNet Study, Dec 18, 2023
Note: a trained GFN is both a sampler (i.e., generating compositional objects) and an inference machine (i.e., answering questions and predicting probabilities). What does a parametrized energy function look like? TO_BE_ANSWERED. But it can be trained with classical maximum likelihood. We have shown how we can jointly train the GFlowNet sampler and…
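A toy sketch of that sampler/inference duality, assuming a hypothetical three-node DAG with made-up forward probabilities (this is my illustration, not Bengio's actual parametrization):

```python
import random

# Assumed learned forward policy P_F(s'|s) on a tiny DAG (values invented).
P_F = {
    "root": {"A": 0.7, "B": 0.3},
    "A": {"A1": 1.0},
    "B": {"B1": 1.0},
}

def sample_trajectory(start: str = "root") -> list:
    """Sampler role: roll out the forward policy until a terminal state."""
    traj, s = [start], start
    while s in P_F:
        children, probs = zip(*P_F[s].items())
        s = random.choices(children, weights=probs)[0]
        traj.append(s)
    return traj

def trajectory_prob(traj: list) -> float:
    """Inference role: multiply forward probabilities along a given trajectory."""
    p = 1.0
    for s, s_next in zip(traj, traj[1:]):
        p *= P_F[s][s_next]
    return p
```

The same table of forward probabilities serves both to generate objects and to answer "how likely is this trajectory?", which is the dual role the note describes.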
-
Consciousness Prior Study – Dec 16, 2023
This study is to help me understand this statement better: the stochastic selection of just a few elements of content (that go into a thought) makes GFlowNets a good candidate to implement the “consciousness prior”. In particular, the GWT bottleneck, when applied to such probabilistic inference, would enforce the inductive bias that the graph of…
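One way to picture that bottleneck is a rough sketch under my own assumptions (the scores and the choice of k are invented): stochastically selecting only a few high-scoring concepts out of many candidates.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_few(scores: np.ndarray, k: int) -> np.ndarray:
    """Sample k distinct concept indices; higher score -> more likely to be chosen."""
    p = np.exp(scores - scores.max())   # softmax over candidate concepts
    p /= p.sum()
    return rng.choice(len(scores), size=k, replace=False, p=p)

scores = rng.normal(size=100)        # 100 hypothetical candidate concepts
chosen = select_few(scores, k=4)     # only 4 pass the "workspace" bottleneck
```

The selection is stochastic rather than a hard top-k, which is the property the statement attributes to GFlowNet-style sampling.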
-
GFlowNet Study – Dec 14, 2023
The kind of sequential probabilistic inference that GFlowNets can perform is a powerful form of learned reasoning machinery, which could be used for interpretation of sensory inputs, interpretation of selected past observations, planning, and counterfactuals https://milayb.notion.site/The-GFlowNet-Tutorial-95434ef0e2d94c24aab90e69b30be9b3#208ee566b55048cda2c87fd5e0e93330 These are the potential applications of GFN. However, I think if they are all achieved then AGI is essentially…
-
GFlowNet Study – Dec 13, 2023
Note: updated mental map of GFlowNet. How does GFlowNet approximate human reasoning? (high level, from Bayesian and variational inference perspectives) Note: training GFlowNet. What does amortization mean in general CS and in the context of GFlowNet? My answer before reading: amortization in CS means spreading computational cost across a series of computations. An example…
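As a toy illustration of that cost-spreading idea (my example; the function is hypothetical), memoization pays for an expensive computation once, so its cost is amortized across many later queries:

```python
from functools import lru_cache

calls = 0  # count how often the real work actually runs

@lru_cache(maxsize=None)
def expensive(n: int) -> int:
    global calls
    calls += 1
    return sum(i * i for i in range(n))  # stand-in for a costly computation

results = [expensive(10_000) for _ in range(1000)]  # 1000 queries...
# ...but the underlying work ran only once, so per-query cost approaches O(1).
```

Amortized inference in GFlowNets is analogous at a high level: the expensive part (training the network) is paid up front, making each later sample or probability query cheap.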
-
GFlowNet Study – Dec 12, 2023
In regular GFlowNets, choosing a_t from s_t deterministically yields some s_{t+1}, which means that we can also write \pi(a_t|s_t)=P_F(s_{t+1}|s_t) for that policy. Doesn’t choosing a_t from s_t deterministically mean P_F(s_{t+1}|s_t)=1? Why do we still need \pi(a_t|s_t)=P_F(s_{t+1}|s_t)? My answer with ChatGPT’s help: \pi(a_t|s_t) remains a stochastic policy given P_F. The deterministic element is instead the equation…
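The point above can be sketched in a toy state space (mine, not from the paper): the *transition* function is deterministic, so each action from s lands in exactly one next state, and the stochastic policy over actions therefore coincides with a forward distribution P_F over next states.

```python
# Deterministic transition function T(s, a) -> s' (toy values, invented).
T = {("s0", "left"): "s1", ("s0", "right"): "s2"}
# Stochastic policy pi(a|s): the agent still chooses actions randomly.
pi = {"s0": {"left": 0.6, "right": 0.4}}

def forward_prob(s: str, s_next: str) -> float:
    """P_F(s_next|s): total policy mass on actions whose transition lands in s_next."""
    return sum(pi[s][a] for (s_, a), t in T.items() if s_ == s and t == s_next)
```

Here P_F("s1"|"s0") equals pi("left"|"s0"): determinism makes the two notations interchangeable, while neither is forced to equal 1.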