neon 0.8.1 documentation
neon: A highly configurable deep learning framework
Release: 0.8.1
Date: May 06, 2015
Contents
- Quick start
  - Getting started
  - Machine learning OPerations (MOP) Layer
  - Upcoming libraries
  - License
- Installation
  - Overview
  - Required Dependencies
  - Optional Dependencies
  - Configuration Setup
  - Installing MPI on an Ubuntu cluster (for distributed models)
  - Virtualenv
  - Upgrading
  - Uninstalling
- Using neon
  - Running
  - Key neon Command-line Arguments
  - Experiment File Format
  - Parallelization
  - Training Models and Learning Parameters
  - Generating Predictions
  - Reporting Performance
  - Distributed Implementations using MPI
  - Available Models
  - Examples
- Hyperparameter optimization
  - Installing our Spearmint Fork
  - YAML file changes
  - Experiment Initialization
  - Running
- Contributing to the Framework
  - neon Contribution Process
  - Architecture
  - Extending the Framework
- ML OPerational Layer (MOP) API
  - neon.backends.backend.Tensor
  - neon.backends.backend.Backend
  - Basic Data Structure
  - Arithmetic Operation Support
  - Logical Operation Support
  - Summarization Operation Support
  - Initialization and Setup
  - Higher Level Operation Support
- MOP API Changes
  - v0.9.0
  - v0.8.0
  - v0.7.0
- Backends
  - Current Implementations
  - Adding a new Backend
- Models
  - Available Models
  - Adding a new type of Model
- Layers
  - Available Layers
  - Adding a new type of Layer
- Transforms
  - Available Transforms
- Metrics
  - Reporting Metric Values
  - Available Metrics
- Learning Rules
  - Available Learning Rules
  - Adding a new type of Learning Rule
- Datasets
  - Available Datasets
  - Basic YAML Parameters
  - Adding a new type of Dataset
  - Working with Imageset
- Experiments
  - Current Implementations
  - Adding a new type of Experiment
  - Generating Predictions
- Utility Functions
  - Python 2 and 3 Compatibility
  - Persistence of objects and data
  - Batched Data Writing
- Frequently Asked Questions
  - Installation
  - Running
  - Contributing
- API Reference
  - Architecture
  - API Functions
  - CPU
  - Cudanet GPU
  - Nervana GPU
  - Nervana Hardware
  - MLP
  - Autoencoder
  - Balance Network
  - RBM
  - DBN
  - Recurrent Neural Network
  - Cost Layer
  - Activation Layer
  - Data Layer
  - Weight Layer
  - Fully Connected Layer
  - Convolutional Layer
  - Pooling Layers
  - DropOut Layer
  - Composite Layers
  - Normalized Layers
  - Boltzmann Layers
  - Recurrent Layers
  - Gradient Descent
  - Value Initialization
  - Misclassification
  - ROC
  - Loss
  - Squared Error
  - Activation Functions
  - Cost Functions
  - MNIST
  - CIFAR10
  - Iris
  - Sparsenet
  - Mobydick
  - Synthetic