2020 has passed, like the end of the world.
Shasta Intro
GTN Intro
We show that such algorithms are possible via Generative Teaching Networks (GTNs).
That allowed us to search for new neural network architectures nine times faster than when using real data.
The architecture of a neural network refers to some of its design choices (e.g., how many layers it should have, how many neurons should be in each layer, which layers should connect to which, etc.).
We instead asked whether the process could be accelerated by a more radical idea: allowing machine learning to create the training data itself. In other words, when training speed cannot be pushed further, generating the training data becomes a way of self-acceleration.
GTNs involve an exciting type of machine learning called meta-learning, here harnessed for architecture search.
We adopt the ideas of numerous papers that search for a small architectural module that is then repeatedly combined through a predetermined blueprint to create architectures of various sizes. Once a high-quality module is discovered, it can be used to create a larger network, which is then trained on real data to convergence for the target task. The idea: generate data from different angles to quickly evaluate candidates, find a high-quality module, and only then train the final network on real data to convergence.
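A minimal sketch of the two-loop idea above, under toy assumptions of my own (a 1-D linear learner, a learnable synthetic dataset, and finite-difference meta-gradients instead of backprop through training; none of this is the paper's actual setup): the inner loop trains a learner from scratch on synthetic data, and the outer loop adjusts the synthetic data so that the resulting learner does well on real data.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" task: y = 2x + 1 with a little noise; used only for the outer (meta) loss.
x_real = rng.uniform(-1, 1, 64)
y_real = 2 * x_real + 1 + 0.01 * rng.normal(size=64)

def inner_train(x_syn, y_syn, steps=50, lr=0.5):
    """Inner loop: train a linear learner (w, b) from scratch on synthetic data."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        err = w * x_syn + b - y_syn
        w -= lr * np.mean(err * x_syn)
        b -= lr * np.mean(err)
    return w, b

def meta_loss(x_syn, y_syn):
    """Outer objective: how well a learner trained on synthetic data fits real data."""
    w, b = inner_train(x_syn, y_syn)
    return np.mean((w * x_real + b - y_real) ** 2)

# Learnable synthetic dataset: fixed inputs, learnable labels (a toy "generator").
x_syn = np.linspace(-1, 1, 8)
y_syn = np.zeros(8)

# Outer loop: improve the synthetic labels via finite-difference meta-gradients.
eps, meta_lr = 1e-4, 1.0  # hyperparameters chosen for this toy problem only
for _ in range(100):
    grad = np.zeros_like(y_syn)
    for i in range(len(y_syn)):
        y_plus, y_minus = y_syn.copy(), y_syn.copy()
        y_plus[i] += eps
        y_minus[i] -= eps
        grad[i] = (meta_loss(x_syn, y_plus) - meta_loss(x_syn, y_minus)) / (2 * eps)
    y_syn -= meta_lr * grad

w, b = inner_train(x_syn, y_syn)
print(w, b)  # a learner trained purely on synthetic data; should approach w=2, b=1
```

The point of the sketch: the synthetic data is never required to look like real data, only to make training on it produce a learner that performs well on real data, which is why it can also serve as a fast, cheap proxy for evaluating candidate architectures.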
However, generating problems requires defining an environment search space, meaning a way to encode a rich space of environments to search through. That is, the search space itself must be specified first.
GTNs could help us move towards AI-generating algorithms that automatically create powerful forms of AI by (1) meta-learning architectures, (2) meta-learning the learning algorithms themselves, and (3) automatically generating training environments.
TensorBase Intro
first principles: not based on any assumptions
what they share:
JIT Compiler
Relational Algebra
All-in-four math: -> (map), + (union/agg), * (join), <> (sort/top)
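A toy Python sketch of how four primitives like these could cover a SQL-style query; the helper names (`arrow`, `plus_agg`, `star_join`, `diamond`) and the in-memory tables are my own illustration, not TensorBase's actual API.

```python
# Toy tables as lists of dicts.
orders = [
    {"user": 1, "amount": 30}, {"user": 2, "amount": 50},
    {"user": 1, "amount": 25}, {"user": 3, "amount": 40},
]
users = [{"user": 1, "name": "a"}, {"user": 2, "name": "b"}, {"user": 3, "name": "c"}]

# -> (map): per-row transformation, e.g. projection.
def arrow(rows, f):
    return [f(r) for r in rows]

# + (union): concatenating two row sets.
def plus_union(xs, ys):
    return xs + ys

# + (agg): folding each group of rows with the same key into one row.
def plus_agg(rows, key, combine):
    groups = {}
    for r in rows:
        k = r[key]
        groups[k] = combine(groups[k], r) if k in groups else r
    return list(groups.values())

# * (join): pair up rows from two tables on a key (hash join).
def star_join(xs, ys, key):
    index = {r[key]: r for r in ys}
    return [{**x, **index[x[key]]} for x in xs if x[key] in index]

# <> (sort/top): order rows, optionally keeping only the top n.
def diamond(rows, by, n=None, desc=False):
    out = sorted(rows, key=lambda r: r[by], reverse=desc)
    return out[:n] if n is not None else out

# Roughly: SELECT name, SUM(amount) AS total GROUP BY user ORDER BY total DESC LIMIT 2
totals = plus_agg(orders, "user",
                  lambda a, b: {"user": a["user"], "amount": a["amount"] + b["amount"]})
joined = star_join(totals, users, "user")
result = diamond(arrow(joined, lambda r: {"name": r["name"], "total": r["amount"]}),
                 "total", n=2, desc=True)
print(result)  # [{'name': 'a', 'total': 55}, {'name': 'b', 'total': 50}]
```

The design appeal is composability: a whole query plan becomes a pipeline of just these four operator shapes, which is much easier to optimize and JIT-compile than the full zoo of SQL constructs.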
simplify the syntactic-sugar part of SQL
saturate throughput; if memory bandwidth is maxed out, what does memory utilization look like?
the problem with Rust's share-nothing approach