Learning
July 1, 2018
I work in the field of natural language understanding, and in the last few years I've figured out how to use that very advantageously with respect to learning. Here's the approach I use:

- Every time you come across a new term or concept, you create a new "notebook"/document. That's right, one notebook per concept. The title of the notebook is the name of the concept.
- You create a summary for the concept using bullet points. What you're trying to maximize with this set of bullet points is the speed at which, in the future, you can re-read them and achieve a brain state similar to the one you had when you originally learned the concept.
- You can then obviously have extended notes below that where you go into more detail.
- Then, crucially, you create something akin to a regex that will allow you to quickly and unambiguously look the concept up in the future. If you just learned what a rectified linear unit is, your pattern might simply be: rlu|(rectified linear unit)
- You then have a hotkey on your computer -- I use Ctrl-Q -- that brings up a text box where you can type the name of the concept you want to bring up (ex. "rlu"). When you press ENTER, if there's an exact match it doesn't give you search results; instead it directly opens the document and makes it instantly viewable/editable (see the sketch after this list).
- As your concept graph starts to grow, you have links within your notebooks to related concepts as they're referenced.
- Each time you read an article that is important to your understanding of a concept, you quickly open up that notebook and add that article, and perhaps one or two bullet points that contain the key things you learned that expanded your sense of that concept.
- This same system can be used for more than learning textbook information. You can use it if you're a project manager to keep tabs on the millions of things you have to juggle; you can have notebooks for people and for lists; and you can have "regexes" for programs/scripts, for web pages, for files/directories, etc., etc.
- More general than "regexes" are context-free grammars. In this context, what that means is the ability to have named "subroutines" for your regexes. For example, if you end up using RLU as a sub-part of a lot of other notebook regexes, then you might define $rlu to be a short form for (rlu|(rectified linear unit)). (A sketch of this expansion follows the list.)
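To make the exact-match lookup step concrete, here is a minimal Python sketch. The notebook directory, the PATTERNS table, and the use of xdg-open are my own placeholder assumptions; the actual tooling is only described above as a Ctrl-Q hotkey.

```python
import re
import subprocess
from pathlib import Path

# Hypothetical layout: one file per concept, each with its own lookup pattern.
NOTEBOOK_DIR = Path.home() / "notebooks"
PATTERNS = {
    "Rectified Linear Unit.txt": r"rlu|(rectified linear unit)",
    "Context-Free Grammar.txt": r"cfg|(context[- ]free grammar)",
}

def lookup(query: str) -> None:
    """Open the notebook whose pattern exactly matches the typed query."""
    query = query.strip()
    hits = [
        name for name, pattern in PATTERNS.items()
        if re.fullmatch(pattern, query, re.IGNORECASE)
    ]
    if len(hits) == 1:
        # Exact, unambiguous match: open the document directly,
        # skipping any intermediate search-results step.
        subprocess.run(["xdg-open", str(NOTEBOOK_DIR / hits[0])])
    else:
        # No match (or an ambiguous one): fall back to listing candidates.
        print("Candidates:", hits)

lookup("rlu")  # opens "Rectified Linear Unit.txt" directly
```

Binding something like this to a global hotkey is then just a matter of whatever your OS or window manager provides.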
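And here is a small sketch of how those named "subroutines" could be expanded before a pattern is compiled. The $name syntax and the DEFINITIONS table are illustrative assumptions on my part, not a prescribed format:

```python
import re

# Hypothetical named sub-patterns, referenced from other patterns via "$name".
DEFINITIONS = {
    "rlu": r"(rlu|(rectified linear unit))",
}

def expand(pattern: str) -> str:
    """Recursively replace each $name reference with its defined sub-pattern."""
    def substitute(match: re.Match) -> str:
        return expand(DEFINITIONS[match.group(1)])
    return re.sub(r"\$(\w+)", substitute, pattern)

# A notebook about, say, "leaky ReLU" can then reuse the $rlu definition:
pattern = expand(r"leaky $rlu")
print(pattern)  # leaky (rlu|(rectified linear unit))
print(bool(re.fullmatch(pattern, "leaky rectified linear unit", re.IGNORECASE)))  # True
```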
I'm also a person who loves exploring "idea space", especially as it relates to understanding intelligence, machine learning, etc. I've been using this approach for that over the last few months as well -- any time I have an "aha" moment about a concept, or about how two concepts are related, I quickly flip open the appropriate notebook and add my idea. When Elon talks about learning, he will sometimes describe it as easy because you just hang a new piece of information on "the tree". What I suspect is that Elon's knowledge tree doesn't crumble as fast as my own. Most times when I learn a new concept or a new facet of a concept, I forget it quite quickly. Now that I have a knowledge tree in digital form, I really do have a place to "hang" new bits of knowledge, and they don't get lost. Hopefully this is a more scalable approach to learning.

Alternating Analog/Digital
January 23, 2018
I once heard it described that the universe seems to be viewable through both an "analog"/continuous lens and a "digital"/discrete lens, and furthermore that as you zoom out from the minute, there can be a kind of back-and-forth between analog and digital models making the most sense. It makes me wonder whether computation will evolve in a similar way. Maybe our deep neural nets will evolve into systems that mix and match continuous and discrete layers, giving rise to a synergy between the two that takes their abilities to the next level. That's a pretty raw and not-thought-through idea, but it has a certain ring to it for me...

Tech Watch
December 29, 2017
As a technology person, I find it fun to watch things progress. Here's a list of various things that I'm looking forward to watching in the coming year(s):

- Falcon Heavy launch, especially the simultaneous landing of its two side cores back on land.
- The Boring Company
- Falcon 9 launch cadence
- Crew Dragon's unmanned launch
- Crew Dragon's crewed launch
- Tesla Model 3's production ramp-up
- Falcon 9 fairing recovery and reuse
- Autonomous vehicles (including, hopefully, the "coast to coast" drive from LA to NYC using Autopilot as promised by Elon)
- Machine Learning / Deep Learning / GPUs
  - DeepMind's efforts to win at StarCraft
- Amazon
- Automated package delivery
- Advanced automation, "lights out" factories
- Tesla Semi
- Solar roofs
- Gigafactory construction and production ramp
- Hyperloop
- Blue Origin New Shepard
- Air taxis, autonomous aircraft, electric aircraft
- EV growth
- Blue Origin New Glenn
- Supercharger Network growth
- Elon's goal of "reflight within 24 hours"
- "Tesla Network" (autonomous ridesharing fleet)
- Tesla Powerwall / Powerpack
- Solar prices, battery prices
- SpaceX's Boca Chica launch site
- CPU lithography: 10 nm, 7 nm, 5 nm, etc.
- SSD sizes / speeds
- SpaceX's Mars ambitions
- Elon's Neuralink efforts
- Possibility of space tourists during a lunar flyby in 2019
- Genetics / CRISPR / Genome Write project / cures for diseases