Deep Java Library (DJL) alternatives and similar libraries
Based on the "Machine Learning" category. Alternatively, view Deep Java Library (DJL) alternatives based on common mentions on social networks and blogs.
- Deeplearning4j – Suite of tools for deploying and training deep learning models using the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch, a modular and tiny C++ library for running math code, and a Java-based math library on top of the core C++ library. Also includes SameDiff, a PyTorch/TensorFlow-like library for running deep learning…
- Oryx 2 – DISCONTINUED. Lambda architecture on Apache Spark and Apache Kafka for real-time, large-scale machine learning.
- DatumBox – An open-source machine learning framework written in Java that allows rapid development of machine learning and statistical applications.
- Weka – DISCONTINUED. Collection of algorithms for data mining tasks ranging from pre-processing to visualization.
README
![DeepJavaLibrary](website/img/deepjavalibrary.png?raw=true "Deep Java Library")
Deep Java Library (DJL)
Overview
Deep Java Library (DJL) is an open-source, high-level, engine-agnostic Java framework for deep learning. DJL is designed to be easy to get started with and simple to use for Java developers. DJL provides a native Java development experience and functions like any other regular Java library.
You don't have to be a machine learning/deep learning expert to get started. You can use your existing Java expertise as an on-ramp to learn and use machine learning and deep learning. You can use your favorite IDE to build, train, and deploy your models. DJL makes it easy to integrate these models with your Java applications.
Because DJL is deep learning engine agnostic, you don't have to make a choice between engines when creating your projects. You can switch engines at any point. To ensure the best performance, DJL also provides automatic CPU/GPU choice based on hardware configuration.
DJL's ergonomic API interface is designed to guide you with best practices to accomplish deep learning tasks. The following pseudocode demonstrates running inference:
```java
// Assume you use a pre-trained model from the model zoo; you just need to load it
Criteria<Image, Classifications> criteria =
        Criteria.builder()
                .optApplication(Application.CV.OBJECT_DETECTION) // find an object detection model
                .setTypes(Image.class, Classifications.class)    // define input and output types
                .optFilter("backbone", "resnet50")               // choose the network architecture
                .build();

Image img = ImageFactory.getInstance().fromUrl("http://..."); // read the image
try (ZooModel<Image, Classifications> model = criteria.loadModel();
        Predictor<Image, Classifications> predictor = model.newPredictor()) {
    Classifications result = predictor.predict(img);
    // get the classification and probability
    ...
}
```
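A `Classifications` result pairs class labels with probabilities. As a plain-Java illustration of the top-k ranking such a result represents (independent of DJL, with made-up labels), a minimal sketch:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class TopK {
    // Rank class labels by probability, highest first, and keep the top k.
    static List<Map.Entry<String, Double>> topK(Map<String, Double> probs, int k) {
        List<Map.Entry<String, Double>> entries = new ArrayList<>(probs.entrySet());
        entries.sort((a, b) -> Double.compare(b.getValue(), a.getValue()));
        return entries.subList(0, Math.min(k, entries.size()));
    }

    public static void main(String[] args) {
        Map<String, Double> probs = Map.of("cat", 0.7, "dog", 0.2, "bird", 0.1);
        for (Map.Entry<String, Double> e : topK(probs, 2)) {
            System.out.println(e.getKey() + ": " + e.getValue());
        }
    }
}
```

In real DJL code you would not implement this yourself; the `Classifications` object exposes the ranked results directly.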
The following pseudocode demonstrates running training:
```java
// Construct your neural network with built-in blocks
Block block = new Mlp(28 * 28, 10, new int[] {128, 64});

Model model = Model.newInstance("mlp"); // create an empty model
model.setBlock(block);                  // set the neural network on the model

// Get the training and validation datasets (MNIST)
Dataset trainingSet = new Mnist.Builder().setUsage(Usage.TRAIN) ... .build();
Dataset validateSet = new Mnist.Builder().setUsage(Usage.TEST) ... .build();

// Set up training configuration, such as Initializer, Optimizer, Loss ...
TrainingConfig config = setupTrainingConfig();

Trainer trainer = model.newTrainer(config);
/*
 * Configure the input shape based on the dataset to initialize the trainer.
 * The first axis is the batch axis; we can use 1 for initialization.
 * MNIST is a 28x28 grayscale image, pre-processed into a 28 * 28 NDArray.
 */
trainer.initialize(new Shape(1, 28 * 28));

EasyTrain.fit(trainer, epoch, trainingSet, validateSet);

// Save the model
model.save(modelDir, "mlp");

// Close the resources
trainer.close();
model.close();
```
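The `Mlp` block above stacks fully connected layers (784 → 128 → 64 → 10). As a self-contained sanity check — plain Java, not DJL code — the number of trainable parameters in such a stack can be computed as:

```java
public class MlpParams {
    // Total trainable parameters (weight matrices plus bias vectors)
    // of a stack of fully connected layers.
    static long paramCount(int... layerSizes) {
        long total = 0;
        for (int i = 1; i < layerSizes.length; i++) {
            total += (long) layerSizes[i - 1] * layerSizes[i] // weight matrix
                    + layerSizes[i];                          // bias vector
        }
        return total;
    }

    public static void main(String[] args) {
        // 28*28 inputs -> 128 -> 64 -> 10 outputs, as in the Mlp block above
        System.out.println(paramCount(28 * 28, 128, 64, 10)); // 109386
    }
}
```

This kind of back-of-the-envelope count is useful when sizing hidden layers for a dataset.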
[Getting Started](docs/quick_start.md)
Resources
- [Documentation](docs/README.md#documentation)
- DJL's D2L Book
- JavaDoc API Reference
Release Notes
- 0.19.0 (Code)
- 0.18.0 (Code)
- 0.17.0 (Code)
- 0.16.0 (Code)
- 0.15.0 (Code)
- 0.14.0 (Code)
- 0.13.0 (Code)
- 0.12.0 (Code)
- 0.11.0 (Code)
- 0.10.0 (Code)
- 0.9.0 (Code)
- 0.8.0 (Code)
- 0.6.0 (Code)
- 0.5.0 (Code)
- 0.4.0 (Code)
- 0.3.0 (Code)
- 0.2.1 (Code)
- 0.2.0 Initial release (Code)
The release of DJL 0.20.0 is planned for October or November 2022.
Building From Source
To build from source, begin by checking out the code. Once you have checked out the code locally, you can build it as follows using Gradle:
```sh
# for Linux/macOS:
./gradlew build

# for Windows:
gradlew build
```
To increase build speed, you can use the following command to skip unit tests:
```sh
# for Linux/macOS:
./gradlew build -x test

# for Windows:
gradlew build -x test
```
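If you only want to use DJL rather than build it, released versions can be consumed as regular dependencies from Maven Central. A minimal Gradle sketch (artifact coordinates and engine choice shown here are assumptions to illustrate the shape; check the documentation for the current coordinates and version):

```groovy
dependencies {
    // pin all DJL modules to one release via the BOM
    implementation platform("ai.djl:bom:0.19.0")
    implementation "ai.djl:api"
    // any supported engine works; PyTorch is shown as an example
    runtimeOnly "ai.djl.pytorch:pytorch-engine"
}
```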
Importing into Eclipse
To import the source project into Eclipse, first generate the Eclipse project files:

```sh
# for Linux/macOS:
./gradlew eclipse

# for Windows:
gradlew eclipse
```

Then, in Eclipse, select File > Import > Gradle > Existing Gradle Project.
Note: please set your workspace text encoding to UTF-8.
Community
You can read our guide to [community forums, following DJL, issues, discussions, and RFCs](docs/forums.md) to figure out the best way to share and find content from the DJL community.
Join our Slack channel to get in touch with the development team, for questions and discussions.
Follow us on Twitter to see updates about new content, features, and releases.
Follow our Zhihu column (知乎专栏) to get the latest DJL content!
Useful Links
License
This project is licensed under the [Apache-2.0 License](LICENSE).