Neural Architecture Search (NAS) Overview
automl.org/nas-overview/
Relevant to AI safety discussions about automated capability improvement and recursive self-improvement; NAS is an example of AI systems assisting in the design of more capable AI architectures, raising questions about automation of AI development pipelines.
Metadata
Importance: 35/100 · documentation · educational
Summary
An overview of Neural Architecture Search (NAS), a subfield of AutoML that automates the design of neural network architectures. It covers the key methods, search spaces, and optimization strategies used to automatically discover high-performing architectures, reducing the need for manual human design.
Key Points
- NAS automates the discovery of optimal neural network architectures, replacing labor-intensive manual design by human experts.
- Key components include defining a search space, a search strategy (e.g., reinforcement learning, evolutionary algorithms, gradient-based), and a performance estimation strategy.
- NAS methods can find architectures that outperform manually designed ones on benchmarks like image classification and language modeling.
- Computational cost is a major challenge; early NAS methods required thousands of GPU hours, though newer approaches like weight sharing have reduced this significantly.
- NAS represents a form of automated capability improvement relevant to AI safety discussions about recursive self-improvement and autonomous AI development.
Cited by 2 pages
| Page | Type | Quality |
|---|---|---|
| Self-Improvement and Recursive Enhancement | Capability | 69.0 |
| Novel / Unknown Approaches | Capability | 53.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 11 KB
# Neural Architecture Search

Neural Architecture Search (NAS) automates the design of neural network architectures. NAS approaches optimize the topology of a network, including how nodes are connected and which operators are chosen. User-defined optimization metrics such as accuracy, model size, or inference time guide the search toward an architecture suited to a specific application. Because the search space is extremely large, traditional evolution- or reinforcement-learning-based AutoML algorithms tend to be computationally expensive, so recent research has focused on more efficient approaches to NAS. In particular, recently developed gradient-based and multi-fidelity methods have provided a promising path and boosted research in these directions. Our group has been very active in developing state-of-the-art NAS methods and has been at the forefront of driving NAS research forward. Below we summarize a few recent important works released by our group.
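The three ingredients described above — a search space, a search strategy, and a performance estimation strategy — can be illustrated with a minimal sketch. This is a hypothetical toy example (the search space, operator names, and scoring function below are all invented for illustration, not taken from any real NAS system): random search over a tiny discrete space, with a deterministic score standing in for a real performance estimator such as validation accuracy after (partial) training.

```python
import random

# Hypothetical search space: each "architecture" is a dict of
# (depth, width, operator). Real NAS search spaces are far larger
# and typically describe cell structures or full computation graphs.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "op": ["conv3x3", "conv5x5", "sep_conv"],
}

def sample_architecture(rng):
    """Search strategy: sample one candidate uniformly at random."""
    return {key: rng.choice(choices) for key, choices in SEARCH_SPACE.items()}

def estimate_performance(arch):
    """Performance estimation stand-in: a deterministic toy score.

    A real NAS system would train the candidate (possibly with
    weight sharing or a low-fidelity proxy) and measure accuracy,
    model size, or inference time.
    """
    score = arch["depth"] * 0.1 + arch["width"] * 0.001
    if arch["op"] == "sep_conv":
        score += 0.05  # pretend separable convolutions help on this task
    return score

def random_search(n_trials=20, seed=0):
    """Run the NAS loop: sample candidates, score them, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = estimate_performance(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

Swapping `random_search` for an evolutionary, reinforcement-learning, or gradient-based strategy changes only how candidates are proposed; the search-space and performance-estimation components stay the same, which is why the article discusses them separately.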
## Selected NAS Papers
### Literature Over
... (truncated, 11 KB total)
Resource ID: d01d8824d9b6171b | Stable ID: NGE3ODdjZT