Kraft

Kraft is a pythonic deep learning framework with built-in GPU acceleration. It follows early PyTorch, slightly altering its API where necessary.

kraft.autograd is an automatic differentiation framework built upon NumPy and CuPy. It defines Variable, Function, and some basic functions in the kraft.autograd.ops module.
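A minimal sketch of what differentiating through a couple of operations might look like. Variable comes from the description above; the constructor signature, the backward() method, and the .grad attribute are assumptions modeled on the early-PyTorch API that Kraft mirrors:

```python
import numpy as np

from kraft.autograd import Variable

# Wrap plain NumPy data in Variables (constructor signature is an assumption).
x = Variable(np.array([1.0, 2.0, 3.0]))
w = Variable(np.array([0.5, 0.5, 0.5]))

# Build a small computation graph; each operation is backed by a Function
# from kraft.autograd.ops.
y = (x * w).sum()

# Reverse-mode differentiation: backward() and .grad mirror early PyTorch
# and are assumptions here, not confirmed API.
y.backward()
print(w.grad)  # expected to hold dy/dw
```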

kraft.optim contains simple, easy-to-follow implementations of several popular optimization algorithms, such as Adam and SGD.
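A hedged sketch of the usual optimization cycle with these optimizers. Adam is named above; the constructor arguments, parameters(), zero_grad(), and step() are assumptions based on the early-PyTorch API the project follows, and model, data_loader, and compute_loss are hypothetical placeholders:

```python
import kraft.optim as optim

# model is assumed to be a kraft.nn.Module exposing a parameters() method.
optimizer = optim.Adam(model.parameters(), lr=1e-3)

for batch in data_loader:              # placeholder iterable over training data
    loss = compute_loss(model, batch)  # hypothetical helper returning a Variable
    optimizer.zero_grad()              # clear accumulated gradients
    loss.backward()                    # reverse-mode autodiff via kraft.autograd
    optimizer.step()                   # apply the optimizer update
```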

kraft.nn provides a Module class. Every neural network built with kraft must inherit this class. kraft.nn also provides some basic layers, such as Linear, Conv2d, MaxPool2d, and AvgPool2d.
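For illustration, a small network might be defined along these lines. Module and Linear come from the description above, while the __init__/forward structure and constructor arguments are assumptions in the spirit of early PyTorch:

```python
import kraft.nn as nn

# A two-layer perceptron; inheriting from Module is required by kraft.
# Linear's constructor arguments and the forward() convention are assumptions.
class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.fc1(x)   # activation omitted; the exact activation API is not shown here
        return self.fc2(x)

model = MLP()
```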

Automatic Differentiation

Kraft applies the chain rule under the hood. A simplified implementation of this idea lives in the _overview/one_file_framework.py file; Kraft mostly follows the ideas expressed in that file.
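To make the idea concrete, here is a tiny, self-contained reverse-mode sketch of the chain rule. It illustrates the general one-file-framework approach and is not the contents of _overview/one_file_framework.py:

```python
import numpy as np

class Var:
    """Minimal Variable: stores a value, a gradient, and a backward closure."""

    def __init__(self, value, parents=(), backward_fn=lambda grad: ()):
        self.value = np.asarray(value, dtype=float)
        self.grad = np.zeros_like(self.value)
        self.parents = parents          # Variables this one was computed from
        self.backward_fn = backward_fn  # maps upstream grad to parent grads

    def __mul__(self, other):
        return Var(
            self.value * other.value,
            parents=(self, other),
            # chain rule for a product: d(xy)/dx = y, d(xy)/dy = x
            backward_fn=lambda grad: (grad * other.value, grad * self.value),
        )

    def __add__(self, other):
        return Var(
            self.value + other.value,
            parents=(self, other),
            backward_fn=lambda grad: (grad, grad),
        )

    def backward(self, grad=None):
        grad = np.ones_like(self.value) if grad is None else grad
        self.grad = self.grad + grad
        # propagate the upstream gradient to each parent via the chain rule
        for parent, parent_grad in zip(self.parents, self.backward_fn(grad)):
            parent.backward(parent_grad)

x = Var(2.0)
y = Var(3.0)
z = x * y + x        # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```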

MNIST example

There is a file called mnist_example.py. It contains an example of applying this framework to the MNIST problem, achieving a precision of roughly 97-98%. It features convolutional layers, regularization mechanisms (Dropout, L1/L2 regularization), and cross-entropy loss.
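As a rough sketch of the kind of model such an example builds. Conv2d, MaxPool2d, Linear, and Dropout are named earlier in this README, but the constructor arguments, the Dropout module usage, and the flattening step below are assumptions rather than the exact contents of mnist_example.py:

```python
import kraft.nn as nn

# Sketch of a small MNIST-style convolutional network (assumed APIs).
class ConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3)   # 1 input channel (grayscale)
        self.pool = nn.MaxPool2d(2)
        self.drop = nn.Dropout(0.5)                  # Dropout module is an assumption
        self.fc = nn.Linear(8 * 13 * 13, 10)         # 10 digit classes

    def forward(self, x):
        x = self.pool(self.conv(x))          # (batch, 8, 13, 13) for 28x28 inputs
        x = x.reshape(x.shape[0], -1)        # flatten; reshape API is an assumption
        return self.fc(self.drop(x))
```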
