DevilKing's blog



2020, a year that went by like the end of the world

Original link

we show that such algorithms are possible via Generative Teaching Networks (GTNs).

That allowed us to search for new neural network architectures nine times faster than when using real data.

The architecture of a neural network refers to some of its design choices (e.g., how many layers it should have, how many neurons should be in each layer, which layers should connect to which, etc.).

We instead asked whether the process could be accelerated by a more radical idea: allowing machine learning to create the training data itself. When training speed hits a ceiling, generate training data as a way to self-accelerate.

GTNs involve an exciting type of machine learning called meta-learning, here harnessed for architecture search.

We adopt the ideas of numerous papers that search for a small architectural module that is then repeatedly combined through a predetermined blueprint to create architectures of various sizes. Once a high-quality module is discovered, it can be used to create a larger network, which is then trained on real data to convergence for the target task. In other words: generate data from a different angle, identify a high-quality module with it, and only then feed in real data to train to convergence.
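A minimal sketch of the GTN idea on a toy 1-D task (hypothetical code, not Uber's implementation): an outer meta-loop tunes a *generator* of synthetic training data, scored by how well a learner trained only on that synthetic data performs on real data. Here the learner is a single weight `w`, the "generator" is one parameter `theta` (the synthetic label), and the meta-gradient is estimated by finite differences for simplicity.

```python
# Toy GTN loop: outer loop optimizes synthetic data, inner loop trains on it.

def inner_train(theta, steps=20, lr=0.3):
    """Train learner weight w on the synthetic pair (x=1.0, y=theta)."""
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w * 1.0 - theta) * 1.0   # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def real_loss(w, real_data):
    """Evaluate the trained learner on real data."""
    return sum((w * x - y) ** 2 for x, y in real_data) / len(real_data)

def meta_train(real_data, meta_steps=50, meta_lr=0.1, eps=1e-4):
    """Adjust the generator parameter so training on its data helps on real data."""
    theta = 0.0
    for _ in range(meta_steps):
        # Finite-difference estimate of d(real_loss)/d(theta) through inner training.
        lo = real_loss(inner_train(theta - eps), real_data)
        hi = real_loss(inner_train(theta + eps), real_data)
        theta -= meta_lr * (hi - lo) / (2 * eps)
    return theta

real_data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0]]  # target task: y = 2x
theta = meta_train(real_data)        # learned synthetic label converges toward 2
w = inner_train(theta)               # learner trained purely on synthetic data
```

The point of the sketch is the nesting: the synthetic data is never checked against the real data directly, only through the performance of a learner trained on it, which is what lets GTNs produce compressed, fast-to-train curricula.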

However, generating problems requires defining an environment search space, meaning a way to encode a rich space of environments to search through. The search space must be defined first.

GTNs could help us move towards AI-generating algorithms that automatically create powerful forms of AI by (1) meta-learning architectures, (2) meta-learning the learning algorithms themselves, and (3) automatically generating training environments.

Original link

First principles: not based on any assumptions

shared something:

JIT Compiler

Relational Algebra

All-in-four math: -> (map), + (union/agg), * (join), <> (sort/top)
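The four primitives above (map, union/agg, join, sort/top) can be sketched over rows represented as plain dicts; all names here are hypothetical, not the engine's actual API:

```python
# Hypothetical sketch of the four relational primitives over rows-as-dicts.

def ra_map(rows, fn):                 # "->": per-row projection/transform
    return [fn(r) for r in rows]

def ra_union(a, b):                   # "+": bag union of two row sets
    return list(a) + list(b)

def ra_agg(rows, key, agg_fn):        # "+" also covers aggregation by key
    groups = {}
    for r in rows:
        groups.setdefault(r[key], []).append(r)
    return [{key: k, "agg": agg_fn(g)} for k, g in groups.items()]

def ra_join(a, b, key):               # "*": hash equi-join on a shared key
    index = {}
    for r in b:
        index.setdefault(r[key], []).append(r)
    return [{**ra_row, **rb} for ra_row in a for rb in index.get(ra_row[key], [])]

def ra_top(rows, key, n):             # "<>": sort descending, keep top n
    return sorted(rows, key=lambda r: r[key], reverse=True)[:n]

# Usage: "total order amount per user, top 1" as a composition of primitives.
users = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
orders = [{"id": 1, "amt": 10}, {"id": 1, "amt": 5}, {"id": 2, "amt": 7}]
joined = ra_join(orders, users, "id")
totals = ra_agg(joined, "name", lambda g: sum(r["amt"] for r in g))
top1 = ra_top(totals, "agg", 1)
```

Composing queries from this small closed set of operators is what makes it feasible to strip SQL down to a thin layer of syntactic sugar over the algebra.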

Simplifies the syntactic sugar in the SQL layer

Saturating throughput: if memory bandwidth is maxed out, what happens to memory utilization?

The share-nothing problem in Rust

Repo link


Start from the VFS part

vfs.go mainly covers the POSIX-related operations; the default implementation in the code is backed by redisMeta.
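A hypothetical Python sketch of the layering described above (the real code is Go): a VFS exposes POSIX-style calls and delegates all metadata to a pluggable Meta backend. In JuiceFS the default backend is Redis-backed (redisMeta); here an in-memory dict stands in for Redis so the sketch stays self-contained, and all class and method names are illustrative only.

```python
from abc import ABC, abstractmethod

class Meta(ABC):
    """Metadata backend interface the VFS layer depends on."""
    @abstractmethod
    def create(self, parent, name, mode): ...
    @abstractmethod
    def lookup(self, parent, name): ...

class DictMeta(Meta):
    """Stand-in for a Redis-backed Meta: same role, in-memory store."""
    def __init__(self):
        self.entries = {}      # (parent_inode, name) -> inode
        self.next_inode = 2    # inode 1 is reserved for the root

    def create(self, parent, name, mode):
        inode = self.next_inode
        self.next_inode += 1
        self.entries[(parent, name)] = inode
        return inode

    def lookup(self, parent, name):
        return self.entries.get((parent, name))

class VFS:
    """POSIX-facing layer; every call resolves through the Meta backend."""
    def __init__(self, meta):
        self.meta = meta

    def mknod(self, parent, name, mode=0o644):
        return self.meta.create(parent, name, mode)

    def lookup(self, parent, name):
        return self.meta.lookup(parent, name)

vfs = VFS(DictMeta())
ino = vfs.mknod(1, "hello.txt")      # create a file under the root inode
```

The design point is that the VFS never touches storage details directly: swapping redisMeta for another metadata engine only requires another Meta implementation.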